Blog

  • How does API Management help?

    By adding multiple APIs, functions, and other services to API Management, you can assemble those components into an integrated product that presents a single entry point to client applications. Composing an API using API Management has advantages that include:

    • Client apps are coupled to the API that expresses business logic, not to the underlying technical implementation of individual microservices. You can change the location and definition of the services without necessarily reconfiguring or updating the client apps.
    • API Management acts as an intermediary. It forwards requests to the right microservice regardless of location and returns responses to users. Users never see the different URIs where microservices are hosted.
    • You can use API Management policies to enforce consistent rules on all microservices in the product. For example, you can transform all XML responses into JSON, if that is your preferred format.
    • Policies also enable you to enforce consistent security requirements.

    API Management also includes helpful tools – you can test each microservice and its operations to ensure that they behave in accordance with your requirements. You can also monitor the behavior and performance of deployed services.
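
    The XML-to-JSON transformation mentioned above is implemented with the xml-to-json policy. The following outbound policy is an illustrative sketch rather than a configuration from this module; the attribute values shown are the policy's common options:

    ```xml
    <policies>
        <inbound>
            <base />
        </inbound>
        <outbound>
            <base />
            <!-- Convert any XML response body to JSON before it reaches the client -->
            <xml-to-json kind="direct" apply="always" consider-accept-header="false" />
        </outbound>
    </policies>
    ```

    With apply="always", the conversion is attempted regardless of the response's Content-Type header; setting consider-accept-header="true" would convert only when the client's Accept header requests JSON.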

    Azure API Management supports importing Azure Function Apps as new APIs or appending them to existing APIs. The process automatically generates a host key in the Azure Function App, which is then assigned to a named value in Azure API Management.

  • The benefits of using Azure API Management to compose your API

    Microservices architectures can be difficult to manage. For example, you might rely on separate teams to implement cross-cutting requirements, such as security, in a consistent way.

    In the online store, your developer teams built the product details and order details microservices at different host URLs. Also, the order details service responds by using XML. You want to ensure that all responses are in JSON format to make things easier for the client app developers.

    In this unit, you learn about the features of API Management. You can use these features to integrate different microservices and present them to client applications with consistent behavior at a single URL.

    Microservices architecture challenges

    The microservices approach to architecture creates a modular application in which each part is loosely coupled to the others. Independent deployment of services reduces the effect of any bugs that might make it through testing into production. This modular approach makes it easier to roll back to a stable version. Also, you can create small, autonomous teams of developers for each microservice. This division fits well with modern Agile practices.

    However, microservices architectures can also present challenges, such as:

    • Client apps are coupled to microservices. If you want to change the location or definition of the microservice, you might have to reconfigure or update the client app.
    • Each microservice can be presented under different domain names or IP addresses. This presentation can give an impression of inconsistency to users and can negatively affect your branding.
    • It can be difficult to enforce consistent API rules and standards across all microservices. For example, one team might prefer to respond with XML and another might prefer JSON.
    • You’re reliant on individual teams to implement security in their microservice correctly. It’s difficult to impose these requirements centrally.
  • Azure Functions

    Azure Functions is a service that enables serverless architectures in Azure. You can write functions without worrying about the supporting infrastructure in many different languages, including C#, Java, JavaScript, PowerShell, and Python. You can use libraries from NuGet and the Node Package Manager (npm). You can also authenticate users with the OAuth standard from providers such as Active Directory, Facebook, Google, and Microsoft Account.

    When you write a function, you choose a template depending on how you want to trigger your code. For example, to execute the function in response to an HTTP request, use the HTTPTrigger template. Other templates execute your code when a new message arrives in a queue or a Blob storage container, or on a predefined schedule.

    When you use Azure Functions in a Consumption Plan, you’re charged only for the time that your code runs.

    Azure API Management

    Azure API Management is a fully managed cloud service that you can use to publish, secure, transform, maintain, and monitor APIs. It helps organizations publish APIs to external, partner, and internal developers to unlock the potential of their data and services. API Management handles all the tasks involved in mediating API calls, including request authentication and authorization, rate limit and quota enforcement, request and response transformation, logging and tracing, and API version management. API Management enables you to create and manage modern API gateways for existing backend services no matter where they’re hosted.

    Because you can publish Azure Functions through API Management, you can use them to implement a microservices architecture, with each function implementing a microservice. By adding several functions to a single API Management product, you can build those microservices into an integrated distributed application. Once the application is built, you can use API Management policies to implement caching or enforce security requirements.

    API Management Consumption Tier

    When you choose a usage plan for API Management, you can choose the consumption tier. The consumption tier is especially suited to microservice-based architectures and event-driven systems. For example, it would be a great choice for our online store web API.

    The consumption tier uses the same underlying service components as the previous tiers, but employs an entirely different architecture based on shared, dynamically allocated resources. It aligns closely with serverless computing models: there’s no infrastructure to manage and no idle capacity. It provides high availability, automatic scaling, and usage-based pricing, all of which make the consumption tier an especially good choice for solutions that expose serverless resources as APIs.

  • Create a new API in API Management from a function app

    The Azure API Management service enables you to construct an API from a set of disparate microservices.

    In your online store, each part of the application is implemented as a microservice – one for the product details, one for order details, and so on. A separate team manages each microservice, and each team uses continuous development and delivery to update and deploy their code regularly. You want to find a way to assemble these microservices into a single product and then manage that product centrally.

    In this unit, you learn how Azure API Management is useful in a serverless architecture by building single APIs from individual microservices.

    Serverless architecture and microservices

    Microservices are a popular approach to the architecture of distributed applications. When you build an application as a collection of microservices, you create many different small services. Each service has a defined domain of responsibility and is developed, deployed, and scaled independently. This modular architecture results in an application that is easier to understand, improve, and test. It also makes continuous delivery easier, because you change only a small part of the whole application when you deploy a microservice.

    Another complementary trend in distributed software development is serverless architecture. In this approach, a host organization publishes a set of services that developers can use to run their code. The developers don’t have to concern themselves with the supporting hardware, operating systems, underlying software, and other infrastructure. Instead, the code runs in stateless computing resources triggered by requests. Costs are only incurred when the services execute, so you don’t pay much for services that are rarely used.

  • Throttle API requests

    It’s common to find that a few users overuse an API. Sometimes, an API is overused to such an extent that you incur extra costs or that responsiveness to other users is reduced. You can use throttling (rate limiting) to help protect API endpoints by restricting the number of times an API can be called within a specified period.

    The Census API, for example, is distributed to lots of government agencies, so the number of calls to the API can become significant. By applying a rate limit policy, we can enable a quick response to all requests so that it isn’t possible for a single client to use all the resources for the Census API.

    In this unit, you learn how to use API Management policies to impose two types of throttling.

    Limit by subscription throttling

    Subscription throttling allows you to set rate limits for the whole API or for a specific API operation. It doesn’t discriminate between clients: every request to the API, or to the specified operation, is throttled in the same way. In our Census API example, subscription throttling could limit the number of times any of the APIs are called within a certain period. Clients would then receive a 429 (Too Many Requests) error when that limit was reached. The problem with this type of throttling is that one client can use up the entire allowance before another client gets a chance.

    For example, the following code demonstrates an example configuration that applies to all API operations. The limit is set to three calls per 15-second period:

    <rate-limit calls="3" renewal-period="15" />
    

    Alternatively, this configuration can be used to target a particular API operation:

    <rate-limit calls="number" renewal-period="seconds">
        <api name="API name" id="API id" calls="number" renewal-period="seconds">
            <operation name="operation name" id="operation id" calls="number" renewal-period="seconds" />
        </api>
    </rate-limit>
    

    Limit by key throttling

    Key throttling allows you to configure different rate limits by any client request value. This type of throttling offers a better way of managing the rate limits as it applies the limit to a specified request key – often the client IP address. It gives every client equal bandwidth for calling the API:

    <rate-limit-by-key calls="number"
                       renewal-period="seconds"
                       increment-condition="condition"
                       counter-key="key value" />
    

    The following example configuration limits the rate limit by the IP address of a request. Here, the limit is set to 10 calls per 60-second period:

    <rate-limit-by-key calls="10"
                  renewal-period="60"
                  increment-condition="@(context.Response.StatusCode == 200)"
                  counter-key="@(context.Request.IpAddress)"/>
    
  • Mask URLs with a transformation policy

    Organizations might need to adjust the information that an API publishes at short notice, for example, to comply with a change in legislation or to address a new security threat.

    The Census API example exposes details about the URL from which the API is being called. This information could allow a malicious user to attempt to access the census data by bypassing the API Management gateway and exposing a less secure endpoint. As lead developer, you want to mask these URLs within the response body of the API.

    Here, you learn how to use API Management policies that manipulate the content of API response headers and bodies.

    Why transform a response?

    The response body of an API call contains the data that is being requested. In the Census API, for example, the response body contains the JSON data for the respondents. You can also see how the body contains URL links to view individual people:

    [Screenshot: a default HTTP response with the href value highlighted, showing an unmasked URL link.]

    These links are based on the Census API endpoints and need to be masked to show the API Management URLs instead.
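
    One way to perform this masking (a sketch, not necessarily this module's exact configuration) is the redirect-content-urls policy, which rewrites links in the response body from the backend URL to the gateway's URL. The find-and-replace policy can then substitute any remaining literal strings; the hostnames below are hypothetical:

    ```xml
    <outbound>
        <base />
        <!-- Rewrite backend links in the response body to point at the gateway -->
        <redirect-content-urls />
        <!-- Hypothetical example: replace a leftover literal backend hostname -->
        <find-and-replace from="https://census-backend.example.net" to="https://apim-gateway.example.com" />
    </outbound>
    ```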

  • Remove technical information from API responses

    Any organization that publishes an API needs to make sure that users can access it securely and that malicious users can’t successfully attack it.

    Governments store much personal data regarding citizens. Census data reveals a lot about each citizen and their life. This data could be exploited to harm people. It’s imperative that any data exposed through API endpoints is secured through modern standards.

    As the lead developer, you look at how to set up a secured API gateway, which protects the census data from unauthorized access. It also helps protect the endpoints from denial-of-service attacks.

    Azure API Management

    The Azure API Management service is hosted in the Azure cloud and is positioned between your APIs and the internet. An Azure API gateway is an instance of the Azure API Management service.

    Publishers of APIs use the Azure portal or other Azure tools to control how each API is exposed to consumers. For example, you might want some APIs to be freely available to developers, for demo purposes, and access to other APIs to be tightly controlled.

  • Use an external cache

    API Management instances usually have an internal cache, which is used to store prepared responses to requests. However, if you prefer, you can use a Redis-compatible external cache instead. One possible external cache system that you can use is the Azure Cache for Redis service.

    You might choose to use an external cache because:

    • You want to avoid the cache being cleared when the API Management service is updated.
    • You want to have greater control over the cache configuration than the internal cache allows.
    • You want to cache more data than can be stored in the internal cache.

    Another reason to configure an external cache is that you want to use caching with the Consumption pricing tier. 
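
    After an external cache is configured for the instance, caching policies can target it through their caching-type attribute (internal, external, or prefer-external). A minimal sketch, assuming a Redis-compatible external cache has already been linked to the instance:

    ```xml
    <inbound>
        <base />
        <!-- Look up responses in the configured external (Redis-compatible) cache -->
        <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" caching-type="external" />
    </inbound>
    <outbound>
        <cache-store duration="60" />
        <base />
    </outbound>
    ```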

  • Configure a caching policy

    Optimal API performance is essential to most organizations. By using a cache of prepared responses in Azure API Management, you can reduce the time an API takes to answer calls.

    Suppose there’s a need for the board gaming API to provide faster responses to requests. For example, users often request prices for various board sizes. API Management policies can accelerate responses by configuring a cache of prepared responses. When a request arrives, API Management checks whether an appropriate response is already in the cache. If there is, that response can be sent to the user without rebuilding it from the data source.

    Here, you learn how to configure such a cache.

    How to control the API Management cache

    To set up a cache, you use an outbound policy named cache-store to store responses. You also use an inbound policy named cache-lookup to check if there’s a cached response for the current request. You can see these two policies in the following example:

    <policies>
        <inbound>
            <base />
            <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" downstream-caching-type="none" must-revalidate="true" caching-type="internal" />
        </inbound>
        <backend>
            <base />
        </backend>
        <outbound>
            <cache-store duration="60" />
            <base />
        </outbound>
        <on-error>
            <base />
        </on-error>
    </policies>
    

    It’s also possible to store individual values in the cache instead of a complete response. Use the cache-store-value policy to add the value, with an identifying key. Retrieve the value from the cache by using the cache-lookup-value policy. If you want to remove a value before it expires, use the cache-remove-value policy:

    <policies>
        <inbound>
            <cache-lookup-value key="12345"
                default-value="$0.00"
                variable-name="boardPrice"
                caching-type="internal" />
            <base />
        </inbound>
        <backend>
            <base />
        </backend>
        <outbound>
            <cache-store-value key="12345"
                value="$3.60"
                duration="3600"
                caching-type="internal" />
            <base />
        </outbound>
        <on-error>
            <base />
        </on-error>
    </policies>
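
    The cache-remove-value policy isn’t shown in the preceding example. Evicting the cached board price before its hour elapses could be sketched like this, reusing the same key:

    ```xml
    <inbound>
        <!-- Remove the cached value for key 12345 ahead of its scheduled expiry -->
        <cache-remove-value key="12345" caching-type="internal" />
        <base />
    </inbound>
    ```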
    

    Use vary-by tags

    It’s important to ensure that, if you serve a response from the cache, it’s relevant to the original request. However, you also want to use the cache as much as possible. Suppose, for example, that the board games Stock Management API received a GET request to the following URL and cached the result:

    http://<boardgames.domain>/stock/api/product?partnumber=3416&customerid=1128

    This request is intended to check the stock levels for a product with part number 3416. The customer ID is used by a separate policy, and doesn’t alter the response. Subsequent requests for the same part number can be served from the cache, as long as the record doesn’t expire. So far, so good.

    Now suppose that a different customer requests the same product:

    http://<boardgames.domain>/stock/api/product?partnumber=3416&customerid=5238

    By default, the response can’t be served from the cache, because the customer ID is different.

    However, the developers point out that the customer ID doesn’t alter the response. It would be more efficient if requests for the same product from different customers could be returned from the cache. Customers would still see the correct information.

    To modify this default behavior, use the vary-by-query-parameter element within the <cache-lookup> policy:

    <policies>
        <inbound>
            <base />
            <cache-lookup vary-by-developer="false" vary-by-developer-groups="false" downstream-caching-type="none" must-revalidate="true" caching-type="internal">
                <vary-by-query-parameter>partnumber</vary-by-query-parameter>
            </cache-lookup>
        </inbound>
        <backend>
            <base />
        </backend>
        <outbound>
            <cache-store duration="60" />
            <base />
        </outbound>
        <on-error>
            <base />
        </on-error>
    </policies>
    

    With this policy, the cache stores and separates responses for each product, because they have different part numbers. The cache doesn’t store separate responses for each customer, because that query parameter isn’t listed.

    By default, Azure API Management doesn’t examine HTTP headers to determine whether a cached response is suitable for a given request. If a header can make a significant difference to a response, use the <vary-by-header> tag. Work with your developer team to understand how each API uses query parameters and headers so you can decide which vary-by tags to use in your policy.

    Within the <cache-lookup> tag, there’s also the vary-by-developer attribute, which is required and set to false in the preceding examples. When this attribute is set to true, API Management examines the subscription key supplied with each request. It serves a response from the cache only if the original request had the same subscription key. Set this attribute to true when each user should see a different response for the same URL. If each user group should see a different response for the same URL, set the vary-by-developer-groups attribute to true.
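
    For example, a per-subscription cache lookup could be sketched like this, with the other attributes as in the earlier examples:

    ```xml
    <cache-lookup vary-by-developer="true" vary-by-developer-groups="false" downstream-caching-type="none" caching-type="internal" />
    ```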

  • Knowledge check

    You’re planning API Management policies for your board games company. You have three APIs:

    • The Board Pricing API: You manufacture boards of various sizes for partner companies to use with their games. Those partners can use this API to request a price estimate for manufacturing boards of different sizes.
    • The Stock Management API: Your staff uses a mobile app that calls this API to determine the stock level of your company’s games.
    • The Sales API: The website uses this API to place orders from customers for your company’s games.

    You added the Stock Management API and the Sales API to an API Management product named Sales.

    For the Board Pricing API, you want to make sure that all responses are sent in XML, even though developers wrote some operations to generate JSON text. The mobile app expects responses in XML, and the website expects responses in JSON.