API Gateway Pattern

Using API gateways for communication between clients and applications

Gary Woodfine, Jun 10, 2023

The API Gateway pattern is a service that provides a single entry point for certain groups of microservices. It's similar to the Facade Pattern from object-oriented design, but in this case it's part of a distributed system. The API Gateway pattern is often conceptually linked to Backend for Frontend (BFF) because both patterns share similar objectives, but they are actually two distinct patterns.

An API gateway sits between a client and a set of backend services, acting as an API front end: it receives API requests, enforces throttling and security policies, passes requests to the appropriate back-end service, and then returns the response to the requester. It serves as a single entry point for all client requests, simplifying the architecture and providing a range of benefits for managing and securing APIs.

An API Gateway is a server that is the single entry point into the system. The API Gateway encapsulates the internal system architecture and provides an API that is tailored to each client. It might have other responsibilities such as authentication, monitoring, load balancing, caching, request shaping and management, and static response handling.

The API Gateway is responsible for request routing, composition, and protocol translation. All requests from clients first go through the API Gateway. It then routes requests to the appropriate microservice. The API Gateway will often handle a request by invoking multiple microservices and aggregating the results. It can translate between web protocols such as HTTP and WebSocket and web‑unfriendly protocols that are used internally.
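
To make the routing responsibility concrete, here is a minimal sketch of path-based routing in TypeScript using Express and the global fetch available in Node 18+. The service names, ports, and route table are hypothetical, and the sketch only forwards simple GET-style requests; a production gateway would normally rely on a dedicated proxy or gateway product rather than hand-rolled forwarding.

```typescript
import express, { Request, Response } from "express";

const app = express();

// Internal service locations -- hypothetical values for illustration only.
const routes: Record<string, string> = {
  "/orders": "http://orders-service:8081",
  "/users": "http://users-service:8082",
};

// Route every incoming request to the backend service that owns its path prefix.
app.use(async (req: Request, res: Response) => {
  const prefix = Object.keys(routes).find((p) => req.path.startsWith(p));
  if (!prefix) {
    res.status(404).json({ error: `No route configured for ${req.path}` });
    return;
  }

  // Forward the call to the internal service (this sketch only handles
  // simple GET-style requests and does not stream request bodies).
  const upstream = await fetch(routes[prefix] + req.originalUrl, {
    method: req.method,
    headers: { accept: "application/json" },
  });

  // Relay the upstream status and JSON body back to the client.
  res.status(upstream.status).json(await upstream.json());
});

app.listen(8080, () => console.log("API gateway listening on :8080"));
```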

Why use an API Gateway?

The API Gateway can also provide each client with a custom API. It typically exposes a coarse-grained API to mobile clients, reducing the number of round trips and conserving bandwidth, while it may expose a finer-grained API to desktop clients. For example, some clients may require only a subset of the data that is available, so the API Gateway can retrieve and return only that data.
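
As a rough illustration of tailoring responses per client, the sketch below trims a product payload for mobile callers based on a client-type header. The header name, field names and catalogue-service URL are all assumptions made for the example.

```typescript
import express, { Request, Response } from "express";

const app = express();

// Hypothetical internal endpoint for the full product representation.
const CATALOGUE_URL = "http://catalogue-service:8083/products";

app.get("/products/:id", async (req: Request, res: Response) => {
  const upstream = await fetch(`${CATALOGUE_URL}/${req.params.id}`);
  const product = await upstream.json();

  // Mobile clients receive a coarse-grained, bandwidth-friendly subset of the data.
  if (req.headers["x-client-type"] === "mobile") {
    const { id, name, price } = product;
    res.json({ id, name, price });
    return;
  }

  // Desktop clients receive the full representation.
  res.json(product);
});

app.listen(8080);
```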

An API gateway manages the traffic between the client and the backend. This means it can handle requests and retrieve data and services, including routing calls, combining multiple API calls, and enforcing policies.

An API Gateway can offer many features. Depending on the product, the feature set may be richer or simpler, but the following are the most common:

  • Authentication - The API Gateway can handle authentication for all incoming requests and pass a request on to the backend services only if it carries a valid authentication token. This can simplify the development of authentication for all of the microservices, and it can also prevent certain types of Denial of Service (DoS) attacks (a sketch combining authentication with request aggregation follows this list).
  • Load balancing - The API Gateway can act as a load balancer to evenly distribute requests across the back‑end services.
  • Caching - The API Gateway can cache backend responses, which can significantly improve performance.
  • Monitoring - The API Gateway can provide monitoring and tracing of requests and responses, as well as application logging.
  • Request Aggregation - The API Gateway can aggregate multiple requests into a single request. This can reduce chattiness and round trips.
  • Response Aggregation - The API Gateway can aggregate multiple responses into a single response. This can reduce chattiness and round trips.
  • Static Response Handling - The API Gateway can handle responses for API calls that do not require a back‑end service. For example, it can return a cached response.
  • Management - The API Gateway can provide a set of management functions. For example, it can provide an API browser that lists available APIs and their operations.
  • Protocol Translation - The API Gateway can translate between web protocols such as HTTP and WebSocket and web‑unfriendly protocols that are used internally.
  • Security - The API Gateway can provide an extra layer of security by encapsulating the internal system architecture and shielding the internal services from the public-facing API. It can also provide some monitoring of the incoming requests.
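
The sketch below, referenced from the authentication bullet above, combines two of these features: a simple bearer-token check applied before any backend call, and aggregation of two hypothetical internal services (profile-service and order-service) into a single client-facing response. The token check is deliberately superficial; real validation (for example JWT verification) is out of scope.

```typescript
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Reject any request that does not carry a bearer token before it reaches a
// backend service. Real validation (e.g. JWT verification) is omitted here.
function requireAuth(req: Request, res: Response, next: NextFunction) {
  const header = req.headers.authorization;
  if (!header || !header.startsWith("Bearer ")) {
    res.status(401).json({ error: "Missing or invalid authentication token" });
    return;
  }
  next();
}

// Aggregate two internal calls into one client-facing response, so the client
// makes a single round trip instead of two. Service URLs are hypothetical.
app.get("/dashboard/:userId", requireAuth, async (req: Request, res: Response) => {
  const userId = req.params.userId;
  const [profile, orders] = await Promise.all([
    fetch(`http://profile-service:8084/users/${userId}`).then((r) => r.json()),
    fetch(`http://order-service:8081/orders?user=${userId}`).then((r) => r.json()),
  ]);

  res.json({ profile, orders });
});

app.listen(8080);
```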

Benefits of an API Gateway

A major benefit of using an API Gateway is that it encapsulates the internal structure of the application. Rather than having to invoke specific services, clients simply talk to the gateway. The API Gateway provides each kind of client with a specific API. This reduces the number of round trips between the client and application. It also simplifies the client code.

A client that talks to multiple microservices has to deal with the additional complexity of orchestrating multiple remote calls. A client that talks to an API Gateway only has to do a single round trip to the API Gateway. This reduces latency and bandwidth usage. It also helps to prevent denial-of-service attacks.

The API Gateway can also mask failures in the backend services by returning cached or default data, which keeps clients working even when a backend service is unavailable and reduces the number of calls that reach the backend. The API Gateway can also handle requests that require server-side processing or load balancing, and it can provide each application with API features suited to its specific needs: for example, some applications may require low latency, while others may need high availability.
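
As a rough sketch of failure masking, the example below falls back to the last cached response, or a safe default, when a call to a hypothetical recommendation-service fails. The in-memory map stands in for whatever cache a real gateway would use.

```typescript
import express, { Request, Response } from "express";

const app = express();

// Last successful responses, keyed by id. An in-memory map keeps the sketch
// simple; a real gateway would typically use a shared cache such as Redis.
const cache = new Map<string, unknown>();

app.get("/recommendations/:id", async (req: Request, res: Response) => {
  try {
    // recommendation-service is a hypothetical backend.
    const upstream = await fetch(
      `http://recommendation-service:8085/recommendations/${req.params.id}`
    );
    if (!upstream.ok) throw new Error(`Upstream returned ${upstream.status}`);

    const data = await upstream.json();
    cache.set(req.params.id, data);
    res.json(data);
  } catch {
    // Mask the failure: serve the last cached result, or a safe default.
    res.json(cache.get(req.params.id) ?? { recommendations: [] });
  }
});

app.listen(8080);
```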

Disadvantages of an API Gateway

The API Gateway pattern also has some disadvantages that must be considered when designing your API Development strategy.

  • Increased complexity - the API Gateway is yet another moving part that must be developed, deployed, and managed, which also adds operational complexity and operational risk.
  • Increased response time - due to the additional network hop through the API Gateway; however, for most applications the cost of an extra round trip is insignificant.
  • Increased development effort - developers must implement and test the API Gateway for each type of client.
  • Single point of failure - all clients depend on the API Gateway; if it fails, all clients fail.
  • Increased security risk - the API Gateway is the entry point for all clients, so a security breach in the gateway has much more impact than a breach in a single application.
  • Increased cost - the API Gateway is yet another piece of infrastructure that must be developed, deployed, operated, and maintained, which increases both upfront and operational cost.

Types of API Gateway

There are two main types of API gateways to choose from: cloud-based API gateways and on-premise API gateways. You can also opt for a hybrid solution, where your API gateway provider hosts the API management layer while your edge gateways are deployed on your infrastructure.

Cloud-based API Gateway

A cloud-based API gateway can give you a head start, as you don't have to worry about infrastructure headaches. These gateways are typically made available by various cloud providers such as Digital Ocean, Netlify, AWS, Azure and GCP, and they typically help you to implement and leverage several benefits:

  • API request management: by intercepting API requests, API gateways can combine, reformat, or otherwise manipulate both requests and the resulting responses. This is useful if clients "say" one thing when calling an API but your microservices need to "hear" something different in order to respond. In this case, the API gateway essentially serves as a translation layer for API calls.
  • Rate limiting: API gateways can "throttle" or rate-limit incoming requests, which means restricting the number of requests that clients can make in a given timeframe. Rate limiting helps mitigate security abuse. It also protects against the risk that buggy or poorly managed clients will overwhelm applications by making large numbers of repeated, unnecessary requests (a minimal rate-limiting sketch follows this list).
  • Load balancing: Although API gateways do more than just load balancing, the ability to balance load by distributing traffic across multiple application instances or microservices is one of their features.
  • Monitoring and observability: API gateways can monitor and log API requests, providing the data necessary to drive observability.
  • Security: API gateways can also enforce security rules. For example, they could block malicious requests to prevent DDoS attacks.
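
As a small illustration of the rate-limiting feature mentioned above, the sketch below applies a fixed-window counter per client IP as Express middleware. The limit, window size, and the in-memory counter store are all arbitrary choices for the example; managed cloud gateways provide this capability out of the box.

```typescript
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Allow at most LIMIT requests per client IP per fixed one-minute window.
// Both values are arbitrary choices for the example.
const LIMIT = 60;
const WINDOW_MS = 60_000;
const counters = new Map<string, { count: number; windowStart: number }>();

function rateLimit(req: Request, res: Response, next: NextFunction) {
  const key = req.ip ?? "unknown";
  const now = Date.now();
  const entry = counters.get(key);

  // Start a fresh window for new clients or expired windows.
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    counters.set(key, { count: 1, windowStart: now });
    next();
    return;
  }

  // Over the limit: reject with 429 Too Many Requests.
  if (entry.count >= LIMIT) {
    res.status(429).json({ error: "Too many requests" });
    return;
  }

  entry.count += 1;
  next();
}

app.use(rateLimit);
app.get("/ping", (_req: Request, res: Response) => {
  res.json({ ok: true });
});

app.listen(8080);
```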

On-premise API Gateways

On-premise API Gateways are API Gateway solutions that are deployed and managed within an organization's own infrastructure, rather than being hosted in the cloud. This approach offers several benefits and considerations for organizations looking to maintain control over their API management.

Here's a detailed look at on-premise API Gateways:

  1. Control and Security: One of the primary advantages of on-premise API Gateways is the level of control and security they offer. Organizations can implement their own security policies, access controls, and compliance measures, ensuring that their data and services are protected according to their specific requirements.
  2. Customization: On-premise solutions allow for extensive customization to meet the unique needs of an organization. This includes tailoring the gateway to specific business processes, integrating with existing systems, and adapting to specific performance and scalability requirements.
  3. Data Privacy: By keeping the API Gateway on-premise, organizations can ensure that their data remains within their own infrastructure, complying with data residency and privacy regulations. This is particularly important for industries with stringent data protection requirements, such as finance and healthcare.
  4. Performance and Latency: On-premise API Gateways can offer lower latency and improved performance compared to cloud-based solutions, as the data does not need to traverse the internet. This is crucial for applications requiring real-time data processing and low-latency responses.
  5. Cost Management: While the initial investment in hardware and infrastructure can be significant, on-premise solutions can offer long-term cost savings, especially for organizations with high API traffic or specific performance requirements that would incur high costs in a cloud environment.
  6. Integration with Legacy Systems: On-premise API Gateways can be more easily integrated with legacy systems and internal databases, allowing organizations to modernize their APIs without completely overhauling their existing infrastructure.
  7. Vendor Lock-in: Organizations can avoid vendor lock-in by choosing an on-premise solution, as they have full control over the technology stack and can switch vendors or migrate to a different solution if needed.
  8. Maintenance and Support: On-premise solutions require organizations to handle their own maintenance, updates, and support. This includes ensuring the gateway is always available, performing regular updates, and troubleshooting any issues that arise.
  9. Scalability: While on-premise solutions offer control, they may require additional investment in hardware and infrastructure to scale with growing API traffic. This can be more complex and costly compared to the auto-scaling features offered by cloud-based solutions.
  10. Disaster Recovery and High Availability: Organizations need to implement their own disaster recovery and high availability strategies for on-premise API Gateways, ensuring that the gateway remains operational during failures or maintenance.

In summary, on-premise API Gateways provide organizations with a high degree of control, customization, and security, making them suitable for environments with specific compliance, performance, or integration requirements. However, they also require a significant investment in infrastructure, maintenance, and support to ensure optimal performance and availability.