Databases

Database integration is fundamental to most modern web applications. This guide will help you choose and implement the right database solution for your Express projects.


Before diving into specific technologies, it’s important to understand the fundamental differences between SQL (relational) and NoSQL databases:

| Feature | SQL (Relational) | NoSQL |
| --- | --- | --- |
| Data Structure | Tables with rows and columns | Various (Documents, Key-Value, Wide-Column, Graph) |
| Schema | Fixed schema, predefined structure | Schema-flexible, dynamic |
| Scaling | Primarily vertical (scale up) | Primarily horizontal (scale out) |
| ACID Compliance | Strong ACID properties | Varies by database (often eventual consistency) |
| Query Language | SQL (Structured Query Language) | Database-specific query APIs |
| Relationships | Relational with JOINs | Often denormalized or reference-based |
| Use Cases | Complex queries, transactions, when data structure is stable | Large datasets, rapid development, flexible schema requirements |

MongoDB is particularly well-suited for:

  • Content-heavy applications: Blogs, CMSs, product catalogs where data structure varies
  • Real-time applications: Chat applications, analytics dashboards, IoT data collection
  • Rapid prototyping: When schema requirements are evolving and you need quick iterations
  • Document-oriented data: When your data naturally fits a document structure
  • JavaScript-centric teams: Teams already comfortable with JavaScript/JSON will find the transition easy

Mongoose, an Object Data Modeling (ODM) library for MongoDB, adds:

  • Schema validation to MongoDB
  • Middleware hooks for data processing
  • Type casting and business logic methods
  • Simplified query syntax
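For instance, here is a minimal sketch of Mongoose in use; the MONGODB_URI variable and the Book model are illustrative assumptions, not part of this guide's examples:

// Example: connecting and querying with Mongoose (illustrative sketch)
const mongoose = require('mongoose');

async function main() {
  await mongoose.connect(process.env.MONGODB_URI);

  // Schema validation and type casting come from the schema definition
  const Book = mongoose.model('Book', new mongoose.Schema({
    title: { type: String, required: true },
    pages: { type: Number, min: 1 }
  }));

  // Simplified, chainable, promise-based query syntax
  const longBooks = await Book.find({ pages: { $gte: 300 } })
    .sort({ title: 1 })
    .limit(10);

  console.log(longBooks);
  await mongoose.disconnect();
}

main().catch(console.error);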

PostgreSQL excels for:

  • Complex data relationships: When you need to enforce relationships between different entities
  • Data integrity requirements: Applications where data consistency is critical (financial, healthcare)
  • Complex reporting: Systems requiring advanced analytics, reporting with complex queries
  • Structured data: When your data has a stable structure that doesn’t change often
  • Compliance requirements: Applications that need strict ACID compliance

Using a singleton pattern ensures that only one database connection is created and reused throughout your application:

// db/connection.js
const { Sequelize } = require('sequelize');
require('dotenv').config();

class SequelizeConnection {
  constructor() {
    this.sequelize = null;
    this.config = {
      database: process.env.DB_NAME,
      username: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      host: process.env.DB_HOST,
      port: process.env.DB_PORT,
      dialect: 'postgres', // or 'mysql', 'sqlite', 'mariadb', 'mssql'
      logging: process.env.NODE_ENV === 'development' ? console.log : false,
      pool: {
        max: 5,
        min: 0,
        acquire: 30000,
        idle: 10000
      }
    };
  }

  connect() {
    if (this.sequelize) {
      console.log('Using existing Sequelize connection');
      return this.sequelize;
    }

    this.sequelize = new Sequelize(
      this.config.database,
      this.config.username,
      this.config.password,
      {
        host: this.config.host,
        port: this.config.port,
        dialect: this.config.dialect,
        logging: this.config.logging,
        pool: this.config.pool
      }
    );

    // Test the connection
    this.sequelize.authenticate()
      .then(() => {
        console.log('Sequelize connected successfully');
      })
      .catch(err => {
        console.error('Sequelize connection error:', err);
        process.exit(1);
      });

    return this.sequelize;
  }
}

// Export singleton instance
module.exports = new SequelizeConnection();

The application entry point then reuses this singleton before starting the server:

// app.js
const express = require('express');
const sequelizeConnection = require('./db/connection');

const app = express();

// Connect to the database before starting the server
function startServer() {
  try {
    // Get the shared Sequelize connection
    const sequelize = sequelizeConnection.connect();

    const PORT = process.env.PORT || 3000;
    app.listen(PORT, () => {
      console.log(`Server running on port ${PORT}`);
    });
  } catch (error) {
    console.error('Failed to start server:', error);
    process.exit(1);
  }
}

startServer();
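When the process shuts down, it is good practice to close the connection as well. A small sketch of how that could look; the signal handling here is an assumption, not part of the original example:

// Gracefully close the Sequelize connection on shutdown (illustrative sketch)
process.on('SIGINT', async () => {
  try {
    // The singleton returns the existing instance
    const sequelize = sequelizeConnection.connect();
    await sequelize.close();
    console.log('Database connection closed');
  } finally {
    process.exit(0);
  }
});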

The same singleton approach works with other clients. For example, a service built on Prisma; this assumes a separate Prisma connection module that exposes a connect() method returning a PrismaClient:

// services/userService.js
const prismaConnection = require('../db/connection'); // assumes a Prisma singleton analogous to the Sequelize one above

// Get the Prisma client
const prisma = prismaConnection.connect();

// Get a user by ID along with their posts
const getUserWithPosts = async (userId) => {
  return await prisma.user.findUnique({
    where: { id: Number(userId) },
    include: {
      posts: {
        select: {
          id: true,
          title: true,
          published: true
        }
      }
    }
  });
};

module.exports = getUserWithPosts;
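A route can then call this service. A minimal sketch; the route path and response shape are assumptions:

// routes/users.js (illustrative)
const express = require('express');
const getUserWithPosts = require('../services/userService');

const router = express.Router();

router.get('/:id', async (req, res, next) => {
  try {
    const user = await getUserWithPosts(req.params.id);
    if (!user) {
      return res.status(404).json({ status: 'error', message: 'User not found' });
    }
    res.json(user);
  } catch (err) {
    next(err); // forward to the database error handler shown later
  }
});

module.exports = router;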

For most projects, a structured organization of database-related files is essential:

  • src/
    • db/
      • connection.js
      • models/
        • userModel.js
        • postModel.js
        • commentModel.js
    • app.js

A Mongoose model, for example, lives under the models directory and combines validation, password hashing, and instance methods:

// db/models/mongodb/user.model.js
const mongoose = require('mongoose');
const bcrypt = require('bcrypt');

const userSchema = new mongoose.Schema({
  username: {
    type: String,
    required: [true, 'Username is required'],
    unique: true,
    trim: true,
    minlength: 3
  },
  firstName: {
    type: String,
    trim: true
  },
  lastName: {
    type: String,
    trim: true
  },
  email: {
    type: String,
    required: [true, 'Email is required'],
    unique: true,
    match: [/^\w+([.-]?\w+)*@\w+([.-]?\w+)*(\.\w{2,3})+$/, 'Please provide a valid email']
  },
  password: {
    type: String,
    required: [true, 'Password is required'],
    minlength: 6,
    select: false // Don't return password by default
  },
  role: {
    type: String,
    enum: ['user', 'admin'],
    default: 'user'
  }
}, {
  timestamps: true, // adds createdAt and updatedAt automatically
  toJSON: { virtuals: true },
  toObject: { virtuals: true }
});

// Virtual for the user's full name
userSchema.virtual('fullName').get(function() {
  return `${this.firstName} ${this.lastName}`;
});

// Pre-save hook to hash the password
userSchema.pre('save', async function(next) {
  if (!this.isModified('password')) {
    return next();
  }
  try {
    const salt = await bcrypt.genSalt(10);
    this.password = await bcrypt.hash(this.password, salt);
    next();
  } catch (error) {
    next(error);
  }
});

// Instance method to check a password
userSchema.methods.comparePassword = async function(candidatePassword) {
  return await bcrypt.compare(candidatePassword, this.password);
};

const User = mongoose.model('User', userSchema);
module.exports = User;
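Here is how the model might be used in an authentication flow. A sketch only; the function names are assumptions:

// Example usage of the User model (illustrative)
const User = require('./db/models/mongodb/user.model');

async function register(username, email, password) {
  // The pre-save hook hashes the password before it is stored
  return await User.create({ username, email, password });
}

async function login(email, password) {
  // password has select: false, so it must be requested explicitly
  const user = await User.findOne({ email }).select('+password');
  if (!user || !(await user.comparePassword(password))) {
    return null; // invalid credentials
  }
  return user;
}

module.exports = { register, login };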

Database migrations track changes to your database schema over time, making it easier to collaborate and deploy changes.

With Sequelize, each migration is a plain module that exports an up and a down function:

// migrations/20231025120000-create-users.js
'use strict';

module.exports = {
  up: async (queryInterface, Sequelize) => {
    await queryInterface.createTable('users', {
      id: {
        allowNull: false,
        autoIncrement: true,
        primaryKey: true,
        type: Sequelize.INTEGER
      },
      username: {
        type: Sequelize.STRING(50),
        allowNull: false,
        unique: true
      },
      email: {
        type: Sequelize.STRING(100),
        allowNull: false,
        unique: true
      },
      password: {
        type: Sequelize.STRING,
        allowNull: false
      },
      role: {
        type: Sequelize.ENUM('user', 'admin'),
        defaultValue: 'user'
      },
      created_at: {
        allowNull: false,
        type: Sequelize.DATE,
        defaultValue: Sequelize.literal('CURRENT_TIMESTAMP')
      },
      updated_at: {
        allowNull: false,
        type: Sequelize.DATE,
        defaultValue: Sequelize.literal('CURRENT_TIMESTAMP')
      }
    });
  },

  down: async (queryInterface, Sequelize) => {
    await queryInterface.dropTable('users');
  }
};

Run migrations with the Sequelize CLI:

npx sequelize-cli db:migrate # Run pending migrations
npx sequelize-cli db:migrate:undo # Revert the most recent migration
npx sequelize-cli db:migrate:undo:all # Revert all migrations
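
At runtime, a Sequelize model maps onto the table this migration creates. A minimal sketch; the file path and field definitions are assumed from the migration above:

// db/models/user.js (illustrative)
const { DataTypes } = require('sequelize');
const sequelizeConnection = require('../connection');

const sequelize = sequelizeConnection.connect();

const User = sequelize.define('User', {
  username: { type: DataTypes.STRING(50), allowNull: false, unique: true },
  email: { type: DataTypes.STRING(100), allowNull: false, unique: true },
  password: { type: DataTypes.STRING, allowNull: false },
  role: { type: DataTypes.ENUM('user', 'admin'), defaultValue: 'user' }
}, {
  tableName: 'users',
  underscored: true, // maps createdAt/updatedAt to created_at/updated_at
  timestamps: true
});

module.exports = User;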

With Prisma, you define your schema in the schema.prisma file and use Prisma Migrate to create and manage migrations:
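
A minimal schema consistent with the User/Post service shown earlier might look like this; the model and field names are assumptions:

// prisma/schema.prisma (illustrative)
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}

model User {
  id    Int     @id @default(autoincrement())
  email String  @unique
  name  String?
  posts Post[]
}

model Post {
  id        Int     @id @default(autoincrement())
  title     String
  published Boolean @default(false)
  author    User    @relation(fields: [authorId], references: [id])
  authorId  Int
}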

# Create a migration without applying it
npx prisma migrate dev --create-only --name add_user_profile
# Apply migrations to development database
npx prisma migrate dev
# Apply migrations to production database
npx prisma migrate deploy
The typical Prisma migration workflow is:

  1. Update your schema.prisma file
  2. Run npx prisma migrate dev --name descriptive_name
  3. Prisma will generate SQL migration files in prisma/migrations/
  4. Review the generated SQL before applying to production
Regardless of the tool, a few migration best practices apply:

  1. Make migrations atomic: Each migration should focus on a single change
  2. Use descriptive names: Name migrations clearly to understand their purpose
  3. Version control: Commit migration files to your repository
  4. Test migrations: Verify both up and down migrations work correctly
  5. Document breaking changes: Note any migrations that require application changes
  6. Backup before production migrations: Always create backups before applying migrations to production
  7. Validate in staging: Test migrations in a staging environment that mirrors production

Effective database error handling combines multiple approaches:

// middleware/error-handler.js
const { StatusCodes } = require('http-status-codes');
const { DatabaseError, UniqueConstraintError, ValidationError } = require('sequelize');

// Combine error logging and handling in one middleware
function databaseErrorHandler(err, req, res, next) {
  // 1. Handle Sequelize validation errors
  // (UniqueConstraintError extends ValidationError, so it must be checked first)
  if (err instanceof UniqueConstraintError) {
    return res.status(StatusCodes.CONFLICT).json({
      status: 'error',
      message: 'A record with this information already exists',
      details: err.errors.map(e => e.message)
    });
  }

  if (err instanceof ValidationError) {
    return res.status(StatusCodes.BAD_REQUEST).json({
      status: 'error',
      message: 'Validation failed',
      details: err.errors.map(e => e.message)
    });
  }

  // 2. Log low-level database errors (failed queries, connection problems)
  if (err instanceof DatabaseError) {
    console.error('Sequelize Error:', {
      message: err.message,
      sql: err.sql,
      parameters: err.parameters
    });
  }

  // 3. Default error response
  const statusCode = err.statusCode || StatusCodes.INTERNAL_SERVER_ERROR;
  res.status(statusCode).json({
    status: 'error',
    message: err.message || 'Internal server error',
    ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
  });
}

module.exports = databaseErrorHandler;
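Register the handler after your routes so that errors passed to next() reach it. A short sketch:

// app.js (excerpt)
const databaseErrorHandler = require('./middleware/error-handler');

// ... route registrations go here ...

// Error-handling middleware must be registered last
app.use(databaseErrorHandler);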

For MongoDB, Mongoose can log every query in development, which helps when diagnosing errors:

// MongoDB query logging (development only)
if (process.env.NODE_ENV === 'development') {
  mongoose.set('debug', (collectionName, methodName, ...args) => {
    // Exclude internal collections such as system.version
    if (collectionName.startsWith('system.')) {
      return;
    }
    console.log(`Mongoose: ${collectionName}.${methodName}`, JSON.stringify(args));
  });
}
  1. Categorize errors: Differentiate between operational errors (e.g., validation failures) and programming errors
  2. Use domain-specific error classes: Create custom error types for your application domains
  3. Log detailed context: Include relevant data for debugging without exposing sensitive information
  4. Implement retry mechanisms: Use exponential backoff for transient failures (see the sketch after this list)
  5. Implement monitoring: Set up alerts for database errors, slow queries, and connection issues
  6. Create error response standards: Maintain consistent error response formats across your API
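
A minimal retry helper with exponential backoff might look like this; the helper name, delays, and the decision of which errors count as transient are assumptions:

// utils/retry.js (illustrative)
async function withRetry(operation, { retries = 3, baseDelayMs = 100 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await operation();
    } catch (err) {
      lastError = err;
      if (attempt === retries) break;
      // Exponential backoff: 100ms, 200ms, 400ms, ...
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise(resolve => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}

module.exports = withRetry;

// Usage: const user = await withRetry(() => getUserWithPosts(id));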

  1. Do you need a relational model?

    • Yes: Consider PostgreSQL, especially if you need complex queries, joins, or transactions
    • No: Consider MongoDB for document-oriented data or when schema flexibility is important
  2. What’s your application’s read/write pattern?

    • Heavy reads: PostgreSQL with proper indexing
    • Heavy writes: MongoDB or PostgreSQL with write optimization
    • Balanced: Either option works well
  3. Do you need advanced data integrity?

    • Yes: PostgreSQL with its constraints and transactions
    • No: MongoDB might offer more flexibility
  4. Is rapid development a priority?

    • Yes: MongoDB might allow faster iterations with its schema flexibility
    • No: PostgreSQL’s structure enforces better data discipline

Remember that the right database choice depends on your specific application requirements, team expertise, and long-term maintenance considerations.