How to Stand Up, Secure, and Deploy a GraphQL API with Azure AD B2C

By: Ben Gavin | November 2, 2021


Companies have been deploying REST-based APIs for years, and it has become the de-facto standard approach for API development. Recently, however, GraphQL has risen as another approach to exposing data to consumers.

GraphQL describes itself as a “query language for APIs and a runtime for fulfilling those queries with your existing data. GraphQL provides a complete and understandable description of the data in your API, gives clients the power to ask for exactly what they need and nothing more, makes it easier to evolve APIs over time, and enables powerful developer tools.”
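For example, a client can ask for exactly the fields it needs and nothing more. The field names below are illustrative, assuming a schema that exposes a list of characters:

```graphql
query {
  characters {
    name
    homePlanet
  }
}
```

The server returns only `name` and `homePlanet` for each character, even if the underlying type carries many more fields.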

ASP.NET Core provides built-in infrastructure to host and secure REST-based APIs and has libraries available to do the same with GraphQL. At Core BTS, we do a lot of application development and modernization using the technologies I’ll be covering in this post.

This post summarizes the process of standing up an API using GraphQL.NET, GraphiQL and ASP.NET Core. I’ll show how to secure the API using Azure AD B2C, and how to build and deploy the solution to Azure using Azure DevOps pipelines. Finally, I’ll show you how to integrate the newly secured API with a React App to customize the user experience.

This post is a summary of my five-part series on the topic, and as such doesn’t delve into the details around each step in the process. Those details can be found in the blog series here. Familiarity with the concepts behind GraphQL, ASP.NET Core, and the Azure Portal for Azure B2C setup is helpful and assumed. All code referenced here and in the accompanying blog series is available on GitHub.

Architectural Overview

There’s a lot to cover, and to set the stage for the upcoming sections, here is a simplified version of the final solution architecture:

On the left of the graphic are our consumers: the React app and the GraphiQL front-end. The API itself is secured with Azure AD B2C, and the consumer applications interact with B2C to obtain tokens to present to the API. The API validates these tokens and grants access to the applicable API resources. Hopefully, by the end of this summary, you’ll have a better idea of how all these parts fit together.

Building the API

A great place to start is getting the API up and running, in this case using a schema similar to many of the GraphQL samples available online. The API is built with ASP.NET Core, so begin by running the dotnet command to create a new project:

dotnet new web --name StarWars.API --output api

Once the project is in place, we add the various GraphQL components:

  • Schema – StarWarsSchema
  • Query(ies) – StarWarsQuery
  • Mutation(s) – StarWarsMutation
  • Graph Types – Retrieval (used as query and mutation results) and Input types (used for mutation input)

Every GraphQL API needs some data to serve, often from a traditional relational or document database. For this solution, a simple in-memory data store is used because data storage patterns are not the primary focus. This data store will also provide a simple mechanism for isolating user data once authentication and authorization have been added, but that will come later.

After adding the GraphQL components, the graphql-dotnet server and GraphiQL runtime can be linked with the ASP.NET Core infrastructure. These NuGet packages provide, respectively, processing for GraphQL queries, mutations, and types, and a modern execution and documentation UI for end users. Details on the specific steps involved can be found here. Once complete, the project is ready to run and presents the GraphiQL UI.
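In Startup, that wiring might look roughly like this. The exact extension-method shapes vary between GraphQL.Server major versions; this sketch follows the newer builder style:

```csharp
// Startup sketch - wiring GraphQL.NET and GraphiQL into ASP.NET Core
public void ConfigureServices(IServiceCollection services)
{
    services.AddGraphQL(builder => builder
        .AddSchema<StarWarsSchema>()   // the schema described above
        .AddSystemTextJson());         // JSON serialization for requests/responses
}

public void Configure(IApplicationBuilder app)
{
    app.UseGraphQL<StarWarsSchema>();  // GraphQL endpoint (default: /graphql)
    app.UseGraphQLGraphiQL();          // GraphiQL UI (default: /ui/graphiql)
}
```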

With the API in place, it’s time to move on to configuring the Azure AD B2C tenant.

Setting up Azure AD B2C

Azure Active Directory Business to Consumer (Azure AD B2C, or just B2C) is what will be used to authenticate our users. Azure AD B2C lets an application easily grant users access via their preferred social media provider login.

B2C Tenant Setup

All the setup for B2C happens in the Azure Portal, so we will start by creating the B2C tenant that will house the users for the application.

To use Azure AD B2C, the ‘Microsoft.AzureActiveDirectory’ resource provider must be registered in the target Azure Subscription (via the subscription’s “Resource Providers” blade in the Azure portal). Once the resource provider is registered, go back to the Azure Portal home screen and click “Create a resource”, then choose “Azure Active Directory B2C”, and walk through the wizard to get the tenant configured.

Once the tenant is set up, “switch into” it before making any additional changes. Click the user information in the upper right corner of the portal and choose “Switch directory”. The newly created B2C tenant should be in the “All Directories” list; find it and click the “Switch” button. The portal will reload within the new tenant – sometimes immediately into the B2C management overview. If it doesn’t, click “Home” and choose “Azure AD B2C” from the dashboard.

Configuring the tenant requires visiting three different areas, with a recommended fourth area set aside to assign company branding to the login experience. Start by linking the identity provider(s) which the application should support. Go to the “Identity Providers” blade and click on the provider(s) for the solution. Each provider setup is a little bit different, but all involve gathering details from the target identity provider. This is usually done through an admin / developer portal for the identity provider, followed by adding the details into the B2C provider setup dialog. The Azure portal has linked documentation for each provider with detailed instructions for linking that provider with B2C.

User Flows

After the identity providers are set up, user flows can be created. User Flows govern the user experience during sign up, sign in, forgot password, etc. Click the “User Flows” blade and add a user flow for at least “Sign-Up / Sign-In” using the “Recommended” options and adding the identity providers configured above to each flow. Once the flows are in place, the associated application registrations can be created for use by the API and GraphiQL front-end.

Application Registrations

Application Registrations can be created from the main B2C configuration pane. Click “App Registrations” and then “New Registration” to get started. Give the application a name (e.g. “Sample GraphQL API”), choose the “Accounts in any provider…” option, and then click “Register” to create the new registration. Once in the registration overview pane, select “Expose an API” to set the “App ID URI” (e.g. graphqldemoapi) and add at least one scope for the API (e.g. character.write).

Most solutions will involve one or more applications accessing the API. To demo this, add a second application registration for the “Sample GraphiQL UI”, again with “Accounts in any provider…” For this application, rather than exposing an API, the application should have “API permissions” assigned. Click the “API permissions” blade, click “Add a permission”, select “My APIs”, and choose the API instance defined earlier – along with selecting any scopes that were added.

Even in summary form, there is a lot going on here so feel free to review the detailed walkthrough for specifics. At this point, the new Azure AD B2C tenant is ready to light up the security aspects of the API and GraphiQL UI.

Connect the API & GraphiQL to B2C

The diagram below walks through a request from the initial GET for /ui/graphiql through to the execution of a particular GraphQL query. The goal of this section is to enable this flow, and although it looks complex (and it is!), the implementation is made easier by leveraging much of the built-in ASP.NET Core and GraphQL.NET capabilities.

The general flow is:

  1. User accesses the GraphiQL front-end from a new browser window
  2. API detects that the user is not logged in and redirects to the Azure AD B2C infrastructure sign-in flow
  3. The user’s browser initiates the sign-in flow with B2C
  4. B2C displays a selection of configured identity providers for the user to select and redirects to the selected provider for authentication
  5. The identity provider presents the user with an MFA challenge if applicable
  6. The identity provider redirects the user back to B2C with the appropriate token(s) for that provider
  7. B2C maps the token(s) to a user in the ‘B2C User Store’ and returns the appropriate identity token to the client browser
  8. The browser requests the GraphiQL front-end, providing the B2C identity token to the API
  9. The API validates the token and requests an access token from B2C on behalf of the user for the scopes required to access the API
  10. The API returns the GraphiQL front-end, with embedded access token
  11. GraphiQL queries are POSTed to the server with the access token
  12. The access token is validated with B2C and the GraphQL queries are executed and results returned

As mentioned above, there is a lot going on here, so start on the API side of the house. Basic ASP.NET Core authentication with B2C is straightforward using the MSAL.NET library (MSAL = Microsoft Authentication Library). MSAL provides a few useful configuration shortcuts that configure both the API and UI sides of the authentication story: the first, ‘AddMicrosoftIdentityWebApiAuthentication’, configures ‘Bearer Token’ style authentication with JSON Web Tokens (JWT), and the second, ‘AddMicrosoftIdentityWebAppAuthentication’, configures OpenID Connect based authentication with cookies. Enabling support for obtaining an access token on behalf of the calling client is as simple as adding a call to ‘EnableTokenAcquisitionToCallDownstreamApi’ after the WebApp authentication setup.

The two ‘Add’ methods mentioned above pull configuration details from the API’s appsettings.json file. This allows the MSAL library to link the application code to the app registrations that were set up earlier in the B2C tenant. Details of the configuration settings, and the calls to the methods mentioned above can be found in the original blog post here.
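Put together, the registration might look like the sketch below. The configuration section name “AzureAdB2C” and the scope URI are assumptions for illustration; the App ID URI and scope echo the examples from the app registration section:

```csharp
// ConfigureServices sketch - "AzureAdB2C" is the assumed appsettings.json section

// API side: validate incoming JWT bearer tokens issued by B2C
services.AddMicrosoftIdentityWebApiAuthentication(Configuration, "AzureAdB2C");

// Web app (GraphiQL host) side: OpenID Connect sign-in with cookies,
// plus acquiring an access token on behalf of the signed-in user
services.AddMicrosoftIdentityWebAppAuthentication(Configuration, "AzureAdB2C")
    .EnableTokenAcquisitionToCallDownstreamApi(
        new[] { "https://<tenant>.onmicrosoft.com/graphqldemoapi/character.write" })
    .AddInMemoryTokenCaches();
```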

Once the authentication / authorization pieces are configured, they can be used by the GraphQL server authorization framework to control access to the GraphQL API, which takes a policy-based approach to authorization. A policy is defined by assigning a name and a true / false test that determines whether the current user matches that policy. Policies can range from a simple “is the user authenticated” check to whatever complexity the business need requires – but keep them as simple as possible, since they are evaluated on every request to the system.

Once the policies are defined, they can be referenced from any GraphQL schema element – anything from a particular query down to a single field on one of the graph types – via the ‘.AuthorizeWith(<policy name>)’ extension method. During authorization checks, the “current user” for GraphQL is represented by a ‘GraphQLUserContext’, built from the Azure AD B2C user that is accessing the API. The referenced policy (or policies) then verify that the user context passes their tests before granting access to the schema element.
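As one possible shape, using the GraphQL.Authorization package’s settings object (registration details differ between package versions, and the policy name here is illustrative):

```csharp
// Define a simple named policy
services.AddSingleton(provider =>
{
    var settings = new AuthorizationSettings();
    settings.AddPolicy("AuthenticatedUser", p => p.RequireAuthenticatedUser());
    return settings;
});

// Reference the policy from a schema element
public class StarWarsQuery : ObjectGraphType
{
    public StarWarsQuery(ICharacterRepository repository)
    {
        Field<ListGraphType<CharacterType>>(
                "characters",
                resolve: context => repository.GetAll())
            .AuthorizeWith("AuthenticatedUser");   // enforced on every request
    }
}
```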

In this example, the API is also hosting the GraphiQL front-end. GraphiQL will need to supply the authentication information alongside each GraphQL request. Due to the nature of the ASP.NET Core and GraphQL interaction, the current user’s ASP.NET Core identity cannot be used during GraphQL user context build-up. To work around this limitation, JWT bearer tokens are generated on the client’s behalf by our ASP.NET Controller and injected into a custom version of the GraphiQL front-end. GraphiQL can then provide the bearer token with every call to the API, which then uses the token during the GraphQL user context creation process. Details on the process can be found here.

In a real application, this user information might control access to a database or other downstream API. In this demo application, the current user is provided an “instance” of the in-memory data model such that users can only see the version of the data that they are manipulating.
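One hypothetical way to implement that isolation is a dictionary keyed by user ID, where each user lazily receives a private copy of the seed data. `Character` and `SeedData` are illustrative names:

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;

// Each authenticated user ID maps to its own copy of the in-memory data,
// so users only ever see the data they themselves are manipulating.
public class PerUserDataStore
{
    private readonly ConcurrentDictionary<string, List<Character>> _data = new();

    public List<Character> ForUser(string userId) =>
        _data.GetOrAdd(userId, _ => SeedData.CreateCharacters());
}
```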

Publishing the application

Modern applications and development processes often require non-trivial environment configuration. The API(s), client applications, databases, and infrastructure involved often need to be configured for multiple deployment “stages” (think development, test, staging, production, etc.). Ensuring that these environments are configured in a secure, repeatable fashion is of utmost importance to minimize security and operational concerns. One way of managing this complexity is to use tools like Azure Bicep and Azure DevOps to securely automate your environment configuration and application deployments.

The result looks something like the diagram below:

Azure Bicep provides a domain specific language (DSL) to express the desired state of the environment(s). These infrastructure-as-code files and accompanying utilities can be executed against your environments to ensure they match that desired state. If gaps are found, the resource manager (mostly) seamlessly updates your environment. Most of the Azure resources used in this post can be configured with Bicep files and parameter files, with the Azure CLI filling the gaps as necessary.

Infrastructure is, of course, only half of the equation. The application code and resources provide the actual functionality that clients use to access the system. Azure DevOps steps in to monitor the code repositories and provides build and deployment scripts (via YAML files) which can turn the raw application code into deployable modules in an automated fashion. Deployments can be managed across environments, and differing configuration (secrets, endpoints, etc.) can be applied to each environment to ensure isolation. Various gating / approval features can be applied to ensure that application downtime is minimized, isolate deployments to non-business critical times, and require approvals from responsible parties prior to deploying to key environments.

For this application, Bicep files are created to represent the main components: a main “playbook” file represents the whole of the infrastructure, along with a series of “module” files that represent specific resources or groups of resources. These “.bicep” files can then be referenced in the Azure DevOps pipeline and executed by the Azure CLI to ensure the destination infrastructure always matches the application’s expectations and specifications. This allows the infrastructure to evolve in a controlled fashion alongside the application.
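A minimal sketch of that playbook/module layout might look like this; the file names and parameters are illustrative, not taken from the sample repository:

```bicep
// main.bicep - hypothetical "playbook" composing module files
param location string = resourceGroup().location
param appName string

module api 'modules/app-service.bicep' = {
  name: 'apiDeploy'
  params: {
    location: location
    appName: appName
  }
}
```

A deployment is then typically applied with something like `az deployment group create --resource-group <rg> --template-file main.bicep`.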

On the build side, the YAML build definitions live in source control with the Bicep templates and application source code. When changes are made to any of the source-controlled resources, the DevOps automation kicks in and updates the environment to match. DevOps builds the application, updates the infrastructure, and then deploys the updated application to a “test” environment. Azure DevOps Library Variable Groups are used as deployment gates, controlling application promotion to environments beyond “test” via approvals, time-boxing, etc.
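A pared-down sketch of such a YAML definition is shown below; the service connection name, paths, and stage layout are placeholders rather than the sample’s actual pipeline:

```yaml
# azure-pipelines.yml - hypothetical build + infrastructure stage
trigger:
  branches:
    include:
      - main

stages:
  - stage: Build
    jobs:
      - job: BuildAndDeployInfra
        pool:
          vmImage: 'ubuntu-latest'
        steps:
          - task: DotNetCoreCLI@2          # build and publish the API
            inputs:
              command: 'publish'
              projects: '**/StarWars.API.csproj'
          - task: AzureCLI@2               # apply the Bicep templates
            inputs:
              azureSubscription: '<service-connection>'
              scriptType: 'bash'
              scriptLocation: 'inlineScript'
              inlineScript: >
                az deployment group create
                --resource-group <rg>
                --template-file infra/main.bicep
```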

Adding a React Front-End

GraphiQL is great for quick troubleshooting and for testing GraphQL queries, and it would make a decent front-end if the core product was the API itself. However, usually the end product is a user-facing application, and GraphiQL is unlikely to suffice. Luckily, leveraging the Apollo Client library, MSAL.js v2, and React can provide the building blocks for a user-friendly rich client application.

The application for this example ends up looking something like this:

Building the entire application is too much for this summary, and the goal here is just to call out some of the libraries that are used by the application to ease the integration of the GraphQL API backend, Azure AD B2C, and the React toolkit.

As mentioned above, React provides the framework for our rich client application, along with Bootstrap for the styling and layout. Authentication is provided by the MSAL.js v2 library, which has a set of React providers and hooks that can be leveraged to provide a rich client login / logout experience in the browser. Lastly, the Apollo GraphQL Client library also supplies React providers and hooks to simplify asynchronous GraphQL query execution.
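Wiring these libraries together might look roughly like the sketch below. The client ID, tenant, user-flow name, and API URL are all placeholders, not values from the sample:

```tsx
import React from "react";
import { PublicClientApplication } from "@azure/msal-browser";
import { MsalProvider } from "@azure/msal-react";
import { ApolloClient, ApolloProvider, InMemoryCache } from "@apollo/client";

// MSAL.js v2 client configured for a B2C user flow (placeholder values)
const pca = new PublicClientApplication({
  auth: {
    clientId: "<spa-client-id>",
    authority: "https://<tenant>.b2clogin.com/<tenant>.onmicrosoft.com/<signin-flow>",
    knownAuthorities: ["<tenant>.b2clogin.com"],
  },
});

// Apollo client pointed at the GraphQL endpoint (placeholder URL)
const apollo = new ApolloClient({
  uri: "https://<api-host>/graphql",
  cache: new InMemoryCache(),
});

export function App() {
  return (
    <MsalProvider instance={pca}>
      <ApolloProvider client={apollo}>
        {/* child components can use the useMsal() and useQuery() hooks */}
      </ApolloProvider>
    </MsalProvider>
  );
}
```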

The API itself gets a few more features, with some additional fields to make the UI more interesting and support for Cross-Origin Resource Sharing (CORS) and multi-mode authentication. The authentication updates are necessary to allow the API to seamlessly authenticate and authorize clients from both the local GraphiQL front-end and the new React SPA. CORS allows the JavaScript powering the React app to make requests to the API without the web browser getting in the way. This is necessary because the React app is served from Azure CDN (Content Delivery Network), which resides at a different URL from our API, and (by default) browsers will block these cross-origin requests under the same-origin policy to protect user security / privacy.
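A minimal CORS sketch for the API might look like this, assuming the React app is served from an Azure CDN endpoint; the origin URL and policy name are placeholders:

```csharp
// ConfigureServices: allow the React app's CDN origin to call the API
services.AddCors(options =>
{
    options.AddPolicy("ReactApp", policy => policy
        .WithOrigins("https://<cdn-endpoint>.azureedge.net")
        .AllowAnyHeader()
        .AllowAnyMethod());
});

// Configure: apply the policy in the pipeline, before the GraphQL endpoint
app.UseCors("ReactApp");
```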

Once the React application and UI are ready for deployment, the Bicep templates and Azure DevOps build process need to be updated to accommodate deployment of the new application. Since this is a React Single-Page Application (SPA), it is common to deploy the front-end via a Content Delivery Network (Azure CDN in this case). The application also leverages Azure Storage Static Websites as the backend for the Azure CDN endpoints. In the end, the new application structure looks like this:

Details on getting the sample application built and deployed can be found here.

Wrapping it Up

This summary covered the basics of getting a new GraphQL API configured and securely delivered to clients – including a custom React application. User authentication is provided via Azure AD B2C, allowing users of the application to sign in with a “local to the application” user or their choice of configured external identity providers (Twitter and GitHub in this example). There is a lot involved in getting everything built and configured; for more in-depth coverage of those details, visit my blog series on the topic.

If you need assistance building an application like this, then contact us. We have a robust team of software developers that can help you build what you need.


