Building a Simple Node.js Backend Using PostgreSQL

Node.js and PostgreSQL make a solid tandem for building quality applications.

Creating and configuring a project

We will do everything on Windows.

Let's install everything we need.

Node.js

Go to the official Node.js website and download the installer for Windows.

Next, run the installation file and follow the instructions of the installation wizard.

Make sure the option to install npm together with Node.js is checked.

After the installation is complete, open the command line and enter node -v and npm -v to check that the installation was successful.

PostgreSQL

Download PostgreSQL from the official website. The installation is straightforward, but make sure pgAdmin and the command-line tools (psql) are selected during installation.

Next we create the database itself and the user.

Open the command line or terminal.

Log in to the PostgreSQL console as the superuser (usually postgres). On Linux or macOS:

sudo -i -u postgres
psql

On Windows, open SQL Shell (psql) from the Start menu, or run psql -U postgres from the command line.

Create a new database:

CREATE DATABASE mydatabase;

Create a new user with password:

CREATE USER myuser WITH ENCRYPTED PASSWORD 'qwerty';

Grant the user rights to the newly created database:

GRANT ALL PRIVILEGES ON DATABASE mydatabase TO myuser;
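While still in psql, it is worth creating the users table that the API below will rely on; the tutorial never creates it explicitly, so the following is an assumed minimal schema (connect to the new database first with \c mydatabase):

```sql
-- Assumed schema for the users table used by the API later in this tutorial.
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  name TEXT NOT NULL,
  email TEXT NOT NULL
);

-- If you create the table as postgres rather than as myuser,
-- grant the application user access to it and to its id sequence:
GRANT ALL PRIVILEGES ON TABLE users TO myuser;
GRANT USAGE ON SEQUENCE users_id_seq TO myuser;
```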

Exit the PostgreSQL console with the command \q.

Initializing a new Node.js project

Let's create a new directory for the project:

mkdir my-node-project
cd my-node-project

Initialize the project using npm:

npm init -y

This will create a package.json file, which will contain basic information about the project, including dependencies and scripts for running it.

Let's install the necessary packages to work with Express and PostgreSQL:

npm install express pg dotenv
  • Express: a framework for creating server applications on Node.js.

  • pg: the node-postgres client for working with PostgreSQL.

  • dotenv: to manage environment variables.

We organize the project in such a way that it is easy to manage and expand:

my-node-project/
│
├── src/
│   ├── routes/
│   │   └── index.js
│   ├── controllers/
│   │   └── userController.js
│   ├── models/
│   │   └── userModel.js
│   ├── config/
│   │   └── db.js
│   └── app.js
├── .env
├── package.json
└── README.md
  • routes: application routes.

  • controllers: request processing logic.

  • models: database models.

  • config: configuration, including the database connection settings.

  • app.js: the main file of the application.

Connecting to a database

Let's create a configuration file to connect to the database, for example, db.js:

const { Pool } = require('pg');
require('dotenv').config();

const pool = new Pool({
  user: process.env.DB_USER,
  host: process.env.DB_HOST,
  database: process.env.DB_NAME,
  password: process.env.DB_PASSWORD,
  port: process.env.DB_PORT,
});

module.exports = pool;

Add environment variables to the file .env:

DB_USER=yourusername
DB_HOST=localhost
DB_NAME=yourdatabase
DB_PASSWORD=yourpassword
DB_PORT=5432
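There is no magic in dotenv here: it simply reads KEY=VALUE lines from .env and copies them into process.env. A simplified sketch of that behavior, for understanding only (use the real package in the project):

```javascript
// Illustration of what dotenv does under the hood: parse KEY=VALUE lines.
// The real library also handles quoting, export prefixes, and more.
function parseEnv(text) {
  const result = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // skip blanks and comments
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue; // ignore malformed lines
    result[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return result;
}

const parsed = parseEnv('DB_USER=myuser\nDB_PORT=5432\n# a comment\n');
console.log(parsed.DB_USER, parsed.DB_PORT); // myuser 5432
```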

Create a test query in app.js to make sure the database connection is established (run it from the project root with node src/app.js):

const express = require('express');
const pool = require('./config/db');

const app = express();

pool.query('SELECT NOW()', (err, res) => {
  if(err) {
    console.error('Error connecting to the database', err.stack);
  } else {
    console.log('Connected to the database:', res.rows);
  }
});

const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});

Building an API using Express

Let's create the route file index.js and define the basic routes:

const express = require('express');
const router = express.Router();
const userController = require('../controllers/userController');

router.get('/users', userController.getUsers);
router.post('/users', userController.createUser);
router.put('/users/:id', userController.updateUser);
router.delete('/users/:id', userController.deleteUser);

module.exports = router;
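The :id segment in the routes above is a route parameter: Express extracts it into req.params. A hypothetical, heavily simplified version of that matching logic, just to show the idea (Express actually uses the path-to-regexp library):

```javascript
// Simplified sketch of route-parameter matching, illustration only.
// Returns the extracted params on a match, or null if the path doesn't match.
function matchRoute(pattern, path) {
  const patParts = pattern.split('/').filter(Boolean);
  const pathParts = path.split('/').filter(Boolean);
  if (patParts.length !== pathParts.length) return null;
  const params = {};
  for (let i = 0; i < patParts.length; i++) {
    if (patParts[i].startsWith(':')) {
      params[patParts[i].slice(1)] = pathParts[i]; // ':id' captures '42'
    } else if (patParts[i] !== pathParts[i]) {
      return null; // literal segment mismatch
    }
  }
  return params;
}

console.log(matchRoute('/users/:id', '/users/42')); // { id: '42' }
```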

In the userController.js controller we describe the logic for interacting with the database:

const pool = require('../config/db');

exports.getUsers = async (req, res) => {
  try {
    const result = await pool.query('SELECT * FROM users');
    res.status(200).json(result.rows);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
};

exports.createUser = async (req, res) => {
  const { name, email } = req.body;
  try {
    const result = await pool.query('INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *', [name, email]);
    res.status(201).json(result.rows[0]);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
};

exports.updateUser = async (req, res) => {
  const { id } = req.params;
  const { name, email } = req.body;
  try {
    const result = await pool.query('UPDATE users SET name = $1, email = $2 WHERE id = $3 RETURNING *', [name, email, id]);
    res.status(200).json(result.rows[0]);
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
};

exports.deleteUser = async (req, res) => {
  const { id } = req.params;
  try {
    await pool.query('DELETE FROM users WHERE id = $1', [id]);
    res.status(204).send();
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
};
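A note on the $1, $2 placeholders used above: pg sends the values separately from the SQL text, so user input can never change the structure of the query. A quick illustration of what goes wrong with naive string concatenation instead:

```javascript
// Anti-example: building SQL by string interpolation.
// Malicious input becomes part of the statement itself.
const name = "x'); DROP TABLE users; --";
const unsafe = `INSERT INTO users (name) VALUES ('${name}')`;
console.log(unsafe);
// INSERT INTO users (name) VALUES ('x'); DROP TABLE users; --')

// With parameterized queries the same input stays an ordinary value:
// pool.query('INSERT INTO users (name) VALUES ($1)', [name]);
```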

We use middleware for JSON parsing, data validation, and error handling. Validation here relies on express-validator, which must be installed separately with npm install express-validator:

const express = require('express');
const app = express();
const routes = require('./routes');
const { body, validationResult } = require('express-validator');

app.use(express.json());

// Validate the body before the router (mounted at /api below) handles it.
app.post('/api/users',
  body('email').isEmail(),
  body('name').notEmpty(),
  (req, res, next) => {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({ errors: errors.array() });
    }
    next();
  }
);

app.use('/api', routes);

app.use((err, req, res, next) => {
  console.error(err.stack);
  res.status(500).send('Something broke!');
});

const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
  console.log(`Server running on port ${PORT}`);
});
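The two validators above boil down to simple checks. A dependency-free sketch of the same rules, for illustration (the real isEmail validator is much stricter than this regex):

```javascript
// Simplified equivalent of the express-validator rules used above:
// email must look like an address, name must be non-empty.
function validateUserBody(body) {
  const errors = [];
  if (!body.name || !String(body.name).trim()) {
    errors.push({ path: 'name', msg: 'must not be empty' });
  }
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(body.email || '')) {
    errors.push({ path: 'email', msg: 'invalid email' });
  }
  return errors;
}

console.log(validateUserBody({ name: '', email: 'not-an-email' }));
```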

Testing and debugging

Now that the API is built and connected to the database, you can and should start testing it.

Postman handles this task well.

  1. Create a request in Postman:

    • Launch Postman and create a new request.

    • Enter the API URL, for example, http://localhost:3000/api/users.

    • Select the request method: GET, POST, PUT, DELETE.

    • If the request requires a body, such as POST or PUT, add the JSON data in the Body section.

  2. Send the request and check the response:

    • Click Send and inspect the status code and body: 200 with JSON for GET and PUT, 201 for POST, 204 with an empty body for DELETE.

    • A 400 response contains the validation errors; a 500 response contains an error message from the server.

Adding logging

Let's install the winston library:

npm install winston

Let's create a file logger.js to configure logging:

const { createLogger, format, transports } = require('winston');

const logger = createLogger({
  level: 'info',
  format: format.combine(
    format.timestamp(),
    format.json()
  ),
  transports: [
    new transports.Console(),
    new transports.File({ filename: 'error.log', level: 'error' }),
    new transports.File({ filename: 'combined.log' })
  ],
});

module.exports = logger;

Import and use the logger in app.js and other files:

const logger = require('./logger');

app.use((req, res, next) => {
  logger.info(`${req.method} ${req.url}`);
  next();
});

app.use((err, req, res, next) => {
  logger.error(err.message);
  res.status(500).send('Something broke!');
});

We now have a simple, working back-end on Node.js with PostgreSQL, plus logging and Postman-based testing. You could also add unit testing here using the Jest library.

Building a back-end on Node.js using PostgreSQL allows you to take advantage of all the features of these technologies!

