Atlas as a service

This translation was prepared as part of the “NoSQL” course.


Many of our clients provide MongoDB as a service to their development teams. Developers can request a MongoDB database instance and receive a connection string and credentials within minutes. As our clients move to MongoDB Atlas, they are also interested in providing their developers with the same level of service.

Atlas has a very powerful control plane for provisioning clusters. However, in a large organization with thousands of developers, it does not always make sense to give so many people direct access to that interface. The purpose of this article is to show how you can use the Atlas APIs to offer MongoDB as a service when MongoDB runs on Atlas.

Specifically, we will demonstrate how to build an interface that offers developers a set of options for creating a MongoDB database instance. To keep things simple, we will limit those options to a set of memory and storage sizes for the cluster and will not consider other parameters such as the cloud provider and region. We will also cover how to add labels to Atlas clusters, since this feature is not supported in the Atlas UI; for example, we add a label that describes the cluster.
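
For reference, the memory and storage options, as well as the labels, end up in the JSON body of the cluster-creation request to the Atlas API. The snippet below is only a sketch of what such a specification might look like; the field names follow the Atlas API v1.0 clusters resource, and the concrete values (cluster name, instance size, disk size, label) are placeholders, not taken from the original article.

// Example cluster specification a developer-facing form might submit (values are placeholders).
const clusterSpec = {
	name: "myDevCluster",
	providerSettings: {
		providerName: "AWS",
		regionName: "US_EAST_1",
		instanceSizeName: "M10"      // the memory/compute option offered to the developer
	},
	diskSizeGB: 10,                  // the storage option offered to the developer
	labels: [
		{ key: "description", value: "analytics sandbox" }   // label describing the cluster
	]
};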

Architecture

Although the Atlas APIs can be called directly from the client side, we chose a three-tier architecture. Its advantages are as follows:

  • the ability to restrict the available functionality as needed;

  • the ability to simplify the APIs exposed to front-end application developers;

  • the ability to fine-tune the protection of the API endpoints;

  • the ability to take advantage of other server-side features, such as triggers and Twilio integration.

Of course, we chose Realm.

Implementation

Server part

Atlas API

The Atlas APIs are wrapped in a set of Realm functions.

For the most part, they all call the Atlas API in the same way (here getOneCluster is taken as an example):

/*
* Gets information about the requested cluster. If clusterName is empty, all clusters will be fetched.
* See https://docs.atlas.mongodb.com/reference/api/clusters-get-one
*
*/
exports = async function(username, password, projectID, clusterName) 
{

	const arg = {
		scheme: 'https',
		host: 'cloud.mongodb.com',
		path: 'api/atlas/v1.0/groups/' + projectID + '/clusters/' + clusterName,
		username: username,
		password: password,
		headers: {'Content-Type': ['application/json'], 'Accept-Encoding': ['bzip, deflate']},
		digestAuth: true
	};

	// The response body is a BSON.Binary object. Parse it and return.
	const response = await context.http.get(arg);
	return EJSON.parse(response.body.text());
};
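
The wrappers for the modifying calls follow the same pattern; only the HTTP verb and the request body change. Below is a minimal sketch of what a modifyCluster wrapper could look like, assuming the Realm HTTP service's patch action and a cluster specification passed in as a document; the actual implementation is in the GitHub repository mentioned below.

// Modifies the named cluster by sending a partial cluster specification
// to the Atlas "Modify One Cluster" endpoint.
exports = async function(username, password, projectID, clusterName, clusterSpec)
{
	const arg = {
		scheme: 'https',
		host: 'cloud.mongodb.com',
		path: 'api/atlas/v1.0/groups/' + projectID + '/clusters/' + clusterName,
		username: username,
		password: password,
		headers: {'Content-Type': ['application/json'], 'Accept-Encoding': ['bzip, deflate']},
		digestAuth: true,
		// The HTTP service expects the request body as a string.
		body: JSON.stringify(clusterSpec)
	};

	const response = await context.http.patch(arg);
	return EJSON.parse(response.body.text());
};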

The source code for each function is hosted on GitHub.

MiniAtlas API

The next step is to expose these functions as endpoints that the client side can use. Alternatively, we could call the functions with the Realm Web SDK, but we decided to stick with the REST protocol, which is more familiar to our web developers.

Using Realm third-party services (HTTP webhooks), we developed the following six endpoints:

API                          Method   Endpoint
Get a list of clusters       GET      /getClusters
Create a cluster             POST     /createCluster
Get the cluster status       GET      /getClusterState?clusterName:cn
Modify a cluster             PATCH    /modifyCluster
Pause or resume a cluster    POST     /pauseCluster
Delete a cluster             DELETE   /deleteCluster?clusterName:cn

Below is the source code of the getClusters endpoint (note that the username and password are retrieved from Realm Values and Secrets):

/*
* GET getClusters
*
* Query Parameters
*
* None
*
* Response - Currently all values documented at https://docs.atlas.mongodb.com/reference/api/clusters-get-all/
*
*/
exports = async function(payload, response) {

	var results = [];

	const username = context.values.get("username");
	const password = context.values.get("apiKey");
	const projectID = context.values.get("projectID");

	// Sending an empty clusterName will return all clusters.
	var clusterName="";

	response = await context.functions.execute("getOneCluster", username, password, projectID, clusterName);
	results = response.results;

	return results;
};
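
The other webhooks look very similar; the main difference for the modifying endpoints is that they parse the request body and pass it on to the corresponding wrapper function. As an illustration, here is a sketch of what the createCluster webhook could look like; the wrapper function name and the exact flow are our assumptions, and the actual source is in the GitHub repository linked below.

/*
* POST createCluster
*
* The request body contains the cluster specification (name, instance size, disk size, labels, ...).
*/
exports = async function(payload, response) {

	const username = context.values.get("username");
	const password = context.values.get("apiKey");
	const projectID = context.values.get("projectID");

	// The webhook body is a BSON.Binary object. Parse it into a document.
	const clusterSpec = EJSON.parse(payload.body.text());

	// Delegate to the wrapper function that calls the Atlas "Create One Cluster" endpoint.
	return context.functions.execute("createCluster", username, password, projectID, clusterSpec);
};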

The source code for each webhook is also hosted on GitHub.

When you save a webhook, a URL is generated that serves as the endpoint for the API.

API endpoint protection

Only authenticated users can execute the functions behind the API endpoints. When calling the API, the caller must pass an Authorization header containing a valid user ID. The endpoint runs this identifier through the following function:

// Extract the user ID from the Authorization header of the incoming request.
exports = function(payload) {
	const headers = context.request.requestHeaders;
	const { Authorization } = headers;
	const user_id = Authorization.toString().replace(/^Bearer/, '');
	return user_id;
};

MongoDB Realm has several built-in authentication providers, including anonymous access, email/password authentication, API keys, and OAuth 2.0 authentication via Facebook, Google, and Apple ID.

For this example, we decided to use Google OAuth, primarily because it is already integrated with the single sign-on provider we use at our company.

The choice of the provider is not important. Regardless of which providers are enabled, an associated user ID will be generated that can be used to authenticate when accessing the API.

Client part

The client side is implemented in jQuery and hosted on Realm.

Authentication

Using the MongoDB Stitch Browser SDK, the client prompts the user to sign in with a Google account (if they are not signed in already) and passes the user's Google credentials to the StitchAppClient:

let credential = new stitch.GoogleRedirectCredential();
client.auth.loginWithRedirect(credential);
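
For completeness, here is a sketch of how the Stitch client could be initialized and how the redirect result might be handled once Google sends the user back to the application. The app ID is a placeholder and the surrounding code is our assumption, not part of the original article.

// Initialize the Stitch client (the app ID below is a placeholder).
const client = stitch.Stitch.initializeDefaultAppClient('miniatlas-xxxxx');

// After Google redirects back to the app, complete the login.
if (client.auth.hasRedirectResult()) {
	client.auth.handleRedirectResult()
	.then(user => console.log(`Logged in as ${user.id}`))
	.catch(err => console.error(err));
}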

The user ID to be used when submitting an API request to the back end can be obtained from the StitchAppClient as follows:

let userId = client.auth.authInfo.userId;

It can then be included in the Authorization header when calling the API. Here is an example of the createCluster API call:

export const createCluster = (uid, data) => {
	let url = `${baseURL}/createCluster`

	const params = {
		method: "post",
		headers: {
			"Content-Type": "application/json;charset=utf-8",
			...(uid && { Authorization: uid })
		},
		...(data && { body: JSON.stringify(data) })
	}

	return fetch(url, params)
	.then(handleErrors)
	.then(response => response.json())
	.catch(error => console.log(error) );
};
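
A call to this helper might then look as follows; the request body shown here is a placeholder mirroring the cluster specification sketched earlier, not the exact payload used by the article's UI.

// Example usage (the cluster options are placeholders).
createCluster(userId, {
	name: "myDevCluster",
	providerSettings: { providerName: "AWS", regionName: "US_EAST_1", instanceSizeName: "M10" },
	diskSizeGB: 10,
	labels: [{ key: "description", value: "analytics sandbox" }]
}).then(result => console.log(result));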

All API calls can be viewed in the webhooks.js file.

Helpful advice

We have benefited greatly from team workspaces in Postman. This tool allows you to collaboratively develop and validate server-side APIs.

Conclusion

This prototype was created to showcase what is possible, and we hope it has given you some ideas. It provides the basis for a solution; how you use it is up to you.

