How to deploy to Vercel with MongoDB Atlas, without the fuss
Now you can deploy your web application to Vercel, and it will use a MongoDB Atlas cluster as its data store. Here’s how.
The Vercel-MongoDB integration was one of the most exciting announcements at MongoDB World 2022. We are witnessing the standardization and gradual simplification of large infrastructure components: with minimal architectural fuss, you can capitalize on the enormous power of your data store and host.
The less you have to deal with architecture, the more time you can spend on your core goals. Let's take a look at the new, more streamlined integration.
It was already fairly simple to use MongoDB with Vercel. The new official integration standardizes the approach and adds several niceties. To show how it works, we'll continue with the project I previously used to demonstrate Vercel and MongoDB together.
Your MONGODB_URI environment variable is the hinge between your Vercel infrastructure and your MongoDB Atlas infrastructure. The official integration shares this variable with your project and handles all the necessary permissions.
Our demo is a SvelteKit app developed locally against a MongoDB datastore. We want to deploy this app to Vercel and have it automatically use a MongoDB Atlas cluster as its datastore. This is a very common setup, and the same approach works for Vue/Nuxt and React/Next as well. Figure 1 shows the basic configuration.
The SvelteKit demo application
The SvelteKit demo app lets you create a simple document, an "apothegm," with text and author fields, and it displays the list of apothegms stored in the database. (An apothegm is a short, pithy saying, and we'll use a few of them as the text of our documents.)
SvelteKit works as a full-stack framework. In our view, we'll use SvelteKit's load function to hit the back end and fetch any apothegms that already exist, as shown in Listing 1.
Listing 1. The front-end load function
export async function load({ params, fetch, session, stuff }) {
  const res = await fetch('/apothegm', {
    method: "GET",
    headers: {
      'content-type': 'application/json'
    }
  });
  return {
    props: {
      apothegms: await res.json()
    }
  };
}
More information on SvelteKit's load function can be found in the SvelteKit documentation. The main idea is that we hit the back-end API before the page bootstraps and then insert the returned JSON into the props.apothegms field. That field is also accessible from the page's regular script element, so the page can access it with the line let apothegms;.
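To make that concrete, here is a rough sketch of how the page's regular script and markup might consume that prop. The actual component isn't reproduced in this article, so the markup and the field names on each apothegm (text and author, taken from the description above) should be read as assumptions:

<script>
  // SvelteKit injects the props returned by load() into exported variables.
  export let apothegms = [];
</script>

{#each apothegms as apo}
  <p>"{apo.text}" ({apo.author})</p>
{/each}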
The /apothegm GET endpoint on the back end looks like Listing 2.
Listing 2. The /apothegm GET endpoint
import clientPromise from '../lib/mongo';

export async function get(request) {
  const dbConnection = await clientPromise;
  const db = dbConnection.db("apothegm");
  const collection = db.collection("apothegm");
  let apos = await collection.find().toArray();
  return {
    status: 200,
    headers: {
      'content-type': 'application/json'
    },
    body: apos
  };
}
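Listing 2 only covers reads. The handler that creates a new apothegm isn't reproduced in this article, but as an illustration of the same pattern, a POST counterpart might look roughly like the sketch below. How the request body is read depends on the SvelteKit version, so treat the details as assumptions rather than the project's actual code:

// Hypothetical POST handler for /apothegm, following the same pattern as Listing 2.
import clientPromise from '../lib/mongo';

export async function post({ request }) {
  // Assumes the body carries the text and author fields described earlier.
  const doc = await request.json();
  const dbConnection = await clientPromise;
  const collection = dbConnection.db("apothegm").collection("apothegm");
  await collection.insertOne({ text: doc.text, author: doc.author });
  return {
    status: 200,
    headers: { 'content-type': 'application/json' },
    body: { ok: true }
  };
}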
Listing 2 returns a body containing the apothegms retrieved from the database, specifically from the apothegm collection in the apothegm database. The method relies on the clientPromise object imported from lib/mongo. Let's look at the relevant parts of that module in Listing 3.
Listing 3. The lib/mongo connection module
import 'dotenv/config';
import { MongoClient } from 'mongodb';

const uri = process.env["MONGODB_URI"];
const options = {
  useUnifiedTopology: true,
  useNewUrlParser: true,
};

let client;
let clientPromise;

if (!uri) {
  throw new Error('Please set Mongo URI');
}

if (process.env['NODE_ENV'] === 'development') {
  // In development, reuse one client across hot reloads via a global.
  if (!global._mongoClientPromise) {
    client = new MongoClient(uri, options);
    global._mongoClientPromise = client.connect();
  }
  clientPromise = global._mongoClientPromise;
} else {
  client = new MongoClient(uri, options);
  clientPromise = client.connect();
}

export default clientPromise;
Listing 3 creates a MongoDB connection using the MONGODB_URI environment variable, which is pulled from the environment with the process.env["MONGODB_URI"] call.
The first line in this file imports dotenv/config. This import causes dotenv to bootstrap itself; dotenv's purpose is to load environment variables from config files into the app in an OS-agnostic manner. The dotenv documentation has more information.
We want the variable to be set to a local URI during development and to the remote MongoDB Atlas URI in production. We can do this by providing a file called .env that dotenv will find during local development but that won't exist in production. We keep that file out of version control by adding it to .gitignore. Listing 4 shows the relevant bits of both files.
Listing 4. The .env and .gitignore files
// .env
MONGODB_URI="mongodb://localhost:27017"

// .gitignore
.env
.env.*
This means our app will hit the local MongoDB installation during development. You can now launch the app with npm run dev, and everything should work.
Production setup with Vercel and MongoDB Atlas
With dev running, we can move on to setting up production. In real life, you would have testing and staging environments in between development and production.
You will need accounts with both Vercel and MongoDB Atlas. Both services offer free hobby tiers that are quick and easy to set up: sign up for MongoDB Atlas and for Vercel if you haven't already.
Import the project into Vercel
Once you have your accounts, log in to Vercel, create a new project, and import the code from the project's GitHub repository. Here's that GitHub project again: https://github.com/MTyson/sveltekit-vercel-mongo.
Once the import completes, Vercel will build and deploy the project. (Sigh of relief: there is no additional infrastructure work.) Although the build will succeed, the deployed app will show an error when you open it, because there is no data store yet.
Create a MongoDB Atlas cluster
Now you want to create a MongoDB Atlas cluster to house your production data. Creating a free cluster is straightforward. You will also need to create a database user.
Back to Vercel to add the integration
Once you have a cluster, the next step is to add the MongoDB integration to your Vercel account (in an enterprise setting, it can be added to your Vercel team instead). Navigate to https://vercel.com/integrations/mongodbatlas and click the "Add Integration" button at the top. A modal with a drop-down menu will appear.
Next, you can add the integration either to all projects or to one specific project. Let's keep it simple and choose "Add to All."
Temporarily back at MongoDB Atlas
Adding the integration opens a new window with a MongoDB login page. Create a MongoDB account if you don't have one, or log in to your existing account.
Next, a dialog asks which MongoDB Atlas organization to add the integration to. Use the default organization for your user.
Next, click the Acknowledge button at the bottom of the screen. This confirms that you understand you will have to uninstall the integration manually if you ever want it removed.
You'll now see the MongoDB Atlas project and the clusters it contains. The cluster chosen in the left-hand dropdown will be associated with the Vercel projects chosen in the right-hand multiselect. In this case, add the Vercel project we created earlier to the right-hand selection. Selecting the correct project on the Vercel side is what creates the bridge between MongoDB Atlas and Vercel.
This gives the Vercel project automatic access to the MONGODB_URI environment variable, and with that in place the app can connect to the data store with ease.
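For reference, Atlas connection strings generally use the mongodb+srv scheme. The values below are placeholders, not the real credentials or host for this project:

MONGODB_URI="mongodb+srv://<username>:<password>@<cluster-host>.mongodb.net/?retryWrites=true&w=majority"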
Back to Vercel to test
Back in Vercel, you'll find the MongoDB Atlas integration under the Integrations tab. From there, you can make any changes to the integration's configuration.
Next, confirm that the Vercel project now has the MongoDB Atlas connection string as an environment variable. Open the Vercel project, click Settings, and then click "Environment Variables." As in Figure 5, you should see a MONGODB_URI variable listed there.
Click the eye icon next to the variable to see its value, which should point to your MongoDB Atlas cluster. This confirms that the environment variable was available to the app upon deployment.
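If you also want to verify the connection from inside the running app, rather than only in the dashboard, a small diagnostic endpoint along these lines can help. The route and response shape here are my own invention, not part of the demo project:

// Hypothetical src/routes/health.js diagnostic endpoint (not in the demo project).
import clientPromise from '../lib/mongo';

export async function get() {
  try {
    const dbConnection = await clientPromise;
    // Ping the cluster to confirm MONGODB_URI points at a reachable deployment.
    await dbConnection.db("apothegm").command({ ping: 1 });
    return { status: 200, body: { ok: true } };
  } catch (err) {
    return { status: 500, body: { ok: false, error: String(err) } };
  }
}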
The Vercel-MongoDB integration lets us easily connect our Vercel application to a MongoDB Atlas data store. The same approach can be used to associate apps with data in different environments, from development to test to staging to production.
The integration provides a standardized way to leverage "scale to zero" global infrastructure with minimal fuss.