Databases
Database integration is fundamental to most modern web applications. This guide will help you choose and implement the right database solution for your Express projects.
1. Choosing the Right Database
SQL vs NoSQL: Understanding the Paradigms
Before diving into specific technologies, it’s important to understand the fundamental differences between SQL (relational) and NoSQL databases:
| Feature | SQL (Relational) | NoSQL |
|---|---|---|
| Data Structure | Tables with rows and columns | Various (Documents, Key-Value, Wide-Column, Graph) |
| Schema | Fixed schema, predefined structure | Schema-flexible, dynamic |
| Scaling | Primarily vertical (scale up) | Primarily horizontal (scale out) |
| ACID Compliance | Strong ACID properties | Varies by database (often eventual consistency) |
| Query Language | SQL (Structured Query Language) | Database-specific query APIs |
| Relationships | Relational with JOINs | Often denormalized or reference-based |
| Use Cases | Complex queries, transactions, stable data structures | Large datasets, rapid development, flexible schema requirements |
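The first two rows of the table can be made concrete with a small illustration: the same user-and-posts data modeled as a single denormalized document versus normalized relational rows. All names and values here are illustrative, not part of any later example.

```javascript
// NoSQL (MongoDB): one denormalized document; nested data travels together
const userDocument = {
  _id: 'u1',
  username: 'ada',
  email: 'ada@example.com',
  posts: [
    { title: 'Hello', published: true },
    { title: 'Drafts', published: false },
  ],
};

// SQL (relational): normalized rows in separate tables, joined at query time
const usersRow = { id: 1, username: 'ada', email: 'ada@example.com' };
const postsRows = [
  { id: 10, author_id: 1, title: 'Hello', published: true },
  { id: 11, author_id: 1, title: 'Drafts', published: false },
];

// A JOIN reassembles the document shape from the normalized rows
const joined = {
  ...usersRow,
  posts: postsRows.filter((p) => p.author_id === usersRow.id),
};
```

The trade-off is visible even at this scale: the document reads back in one operation, while the relational form avoids duplicating user data if posts are queried on their own.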
When to Choose MongoDB with Mongoose
MongoDB is particularly well-suited for:
- Content-heavy applications: Blogs, CMSs, product catalogs where data structure varies
- Real-time applications: Chat applications, analytics dashboards, IoT data collection
- Rapid prototyping: When schema requirements are evolving and you need quick iterations
- Document-oriented data: When your data naturally fits a document structure
- JavaScript-centric teams: Teams already comfortable with JavaScript/JSON will find the transition easy
Mongoose adds:
- Schema validation to MongoDB
- Middleware hooks for data processing
- Type casting and business logic methods
- Simplified query syntax
When to Choose PostgreSQL
PostgreSQL excels for:
- Complex data relationships: When you need to enforce relationships between different entities
- Data integrity requirements: Applications where data consistency is critical (financial, healthcare)
- Complex reporting: Systems requiring advanced analytics, reporting with complex queries
- Structured data: When your data has a stable structure that doesn’t change often
- Compliance requirements: Applications that need strict ACID compliance
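The integrity and compliance points above usually come down to transactions: either every statement in a unit of work commits, or none does. A minimal sketch using Sequelize’s managed transactions, assuming a hypothetical `Account` model and an already-configured `sequelize` instance (both are passed in as parameters here; neither appears elsewhere in this guide):

```javascript
// Sketch: an ACID money transfer. If any step throws, Sequelize rolls
// back the whole transaction and neither account balance changes.
async function transferFunds(sequelize, Account, fromId, toId, amount) {
  return sequelize.transaction(async (t) => {
    // Row lock prevents concurrent transfers from double-spending
    const from = await Account.findByPk(fromId, {
      transaction: t,
      lock: t.LOCK.UPDATE,
    });

    if (from.balance < amount) {
      throw new Error('Insufficient funds'); // triggers rollback
    }

    await from.decrement('balance', { by: amount, transaction: t });
    await Account.increment('balance', {
      by: amount,
      where: { id: toId },
      transaction: t,
    });
    // Both updates commit together when the callback resolves
  });
}

module.exports = transferFunds;
```

The callback form (`sequelize.transaction(async (t) => …)`) commits automatically on success and rolls back on any thrown error, which is harder to get wrong than manual `commit()`/`rollback()` calls.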
2. Database Connection Patterns
The Singleton Pattern
Using a singleton pattern ensures that only one database connection is created and reused throughout your application:
```javascript
// db/connection.js
const { Sequelize } = require('sequelize');
require('dotenv').config();

class SequelizeConnection {
  constructor() {
    this.sequelize = null;
    this.config = {
      database: process.env.DB_NAME,
      username: process.env.DB_USER,
      password: process.env.DB_PASSWORD,
      host: process.env.DB_HOST,
      port: process.env.DB_PORT,
      dialect: 'postgres', // or 'mysql', 'sqlite', 'mariadb', 'mssql'
      logging: process.env.NODE_ENV === 'development' ? console.log : false,
      pool: {
        max: 5,
        min: 0,
        acquire: 30000,
        idle: 10000
      }
    };
  }

  connect() {
    if (this.sequelize) {
      console.log('Using existing Sequelize connection');
      return this.sequelize;
    }

    this.sequelize = new Sequelize(
      this.config.database,
      this.config.username,
      this.config.password,
      {
        host: this.config.host,
        port: this.config.port,
        dialect: this.config.dialect,
        logging: this.config.logging,
        pool: this.config.pool
      }
    );

    // Test the connection
    this.sequelize.authenticate()
      .then(() => {
        console.log('Sequelize connected successfully');
      })
      .catch(err => {
        console.error('Sequelize connection error:', err);
        process.exit(1);
      });

    return this.sequelize;
  }
}

// Export singleton instance
module.exports = new SequelizeConnection();
```
```javascript
// db/connection.js
const mongoose = require('mongoose');
require('dotenv').config();

class MongoConnection {
  constructor() {
    this.mongoose = mongoose;
    this.isConnected = false;
    this.MONGO_URI = process.env.MONGO_URI;
  }

  async connect() {
    if (this.isConnected) {
      console.log('Using existing MongoDB connection');
      return;
    }

    try {
      // useNewUrlParser and useUnifiedTopology are the defaults
      // in Mongoose 6+ and can be omitted there
      await this.mongoose.connect(this.MONGO_URI, {
        useNewUrlParser: true,
        useUnifiedTopology: true,
      });

      this.isConnected = true;
      console.log('MongoDB connected successfully');
    } catch (error) {
      console.error('MongoDB connection error:', error);
      process.exit(1);
    }
  }
}

// Export singleton instance
module.exports = new MongoConnection();
```
```javascript
// db/connection.js
const { PrismaClient } = require('@prisma/client');
require('dotenv').config();

class PrismaConnection {
  constructor() {
    this.prisma = null;
    this.options = process.env.NODE_ENV === 'development'
      ? { log: ['query', 'info', 'warn', 'error'], errorFormat: 'pretty' }
      : { log: ['warn', 'error'] };
  }

  connect() {
    if (this.prisma) {
      console.log('Using existing Prisma connection');
      return this.prisma;
    }

    try {
      this.prisma = new PrismaClient(this.options);
      console.log('Prisma connected successfully');

      // Optional: Register shutdown handlers
      process.on('beforeExit', async () => {
        await this.disconnect();
      });

      return this.prisma;
    } catch (error) {
      console.error('Prisma connection error:', error);
      process.exit(1);
    }
  }

  async disconnect() {
    if (!this.prisma) {
      return;
    }

    await this.prisma.$disconnect();
    this.prisma = null;
    console.log('Prisma disconnected');
  }
}

// Export singleton instance
module.exports = new PrismaConnection();
```
Using the Connection in Your Application
```javascript
// app.js
const express = require('express');
const sequelizeConnection = require('./db/connection');

const app = express();

// Connect to database before starting server
function startServer() {
  try {
    // Get Sequelize connection
    const sequelize = sequelizeConnection.connect();

    const PORT = process.env.PORT || 3000;
    app.listen(PORT, () => {
      console.log(`Server running on port ${PORT}`);
    });
  } catch (error) {
    console.error('Failed to start server:', error);
    process.exit(1);
  }
}

startServer();
```
```javascript
// app.js
const express = require('express');
const mongoConnection = require('./db/connection');

const app = express();

// Connect to database before starting server
async function startServer() {
  try {
    // Connect to MongoDB
    await mongoConnection.connect();

    const PORT = process.env.PORT || 3000;
    app.listen(PORT, () => {
      console.log(`Server running on port ${PORT}`);
    });
  } catch (error) {
    console.error('Failed to start server:', error);
    process.exit(1);
  }
}

startServer();
```
```javascript
// app.js
const express = require('express');
const prismaConnection = require('./db/connection');

const app = express();

// Connect to database before starting server
function startServer() {
  try {
    // Get Prisma connection
    const prisma = prismaConnection.connect();

    const PORT = process.env.PORT || 3000;
    app.listen(PORT, () => {
      console.log(`Server running on port ${PORT}`);
    });

    // Handle graceful shutdown
    const shutdown = async () => {
      console.log('Shutting down server...');
      await prismaConnection.disconnect();
      process.exit(0);
    };

    process.on('SIGINT', shutdown);
    process.on('SIGTERM', shutdown);

  } catch (error) {
    console.error('Failed to start server:', error);
    process.exit(1);
  }
}

startServer();
```
Example: Using Prisma in a Service Layer
```javascript
// services/userService.js
const prismaConnection = require('../db/connection');

// Get Prisma client
const prisma = prismaConnection.connect();

// Get user by ID with their posts
const getUserWithPosts = async (userId) => {
  return await prisma.user.findUnique({
    where: { id: Number(userId) },
    include: {
      posts: {
        select: {
          id: true,
          title: true,
          published: true
        }
      }
    }
  });
};

module.exports = getUserWithPosts;
```
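A route can then consume the service. The sketch below wraps the handler in a small factory so the service is injected rather than imported directly, which keeps the handler easy to test; the factory name and route shape are illustrative, not part of this guide’s codebase:

```javascript
// Sketch: an Express-style route handler built on the service layer.
// The service function is injected as a parameter.
function makeGetUserHandler(getUserWithPosts) {
  return async (req, res, next) => {
    try {
      const user = await getUserWithPosts(req.params.id);
      if (!user) {
        return res.status(404).json({ message: 'User not found' });
      }
      res.json(user);
    } catch (error) {
      next(error); // forward to the error-handling middleware
    }
  };
}

// Usage (illustrative):
// app.get('/users/:id', makeGetUserHandler(require('./services/userService')));

module.exports = makeGetUserHandler;
```

Keeping database access behind the service function means the route only deals with HTTP concerns (status codes, error forwarding), which pays off once the error-handling middleware from section 5 is in place.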
3. Structuring Database Models
Model Organization
For most projects, a structured organization of database-related files is essential:
```
src/
├── db/
│   ├── connection.js
│   └── models/
│       ├── userModel.js
│       ├── postModel.js
│       └── commentModel.js
└── app.js
```
Creating Database Models
```javascript
// db/models/mongodb/user.model.js
const mongoose = require('mongoose');
const bcrypt = require('bcrypt');

const userSchema = new mongoose.Schema({
  username: {
    type: String,
    required: [true, 'Username is required'],
    unique: true,
    trim: true,
    minlength: 3
  },
  email: {
    type: String,
    required: [true, 'Email is required'],
    unique: true,
    match: [/^\w+([.-]?\w+)*@\w+([.-]?\w+)*(\.\w{2,3})+$/, 'Please provide a valid email']
  },
  password: {
    type: String,
    required: [true, 'Password is required'],
    minlength: 6,
    select: false // Don't return password by default
  },
  role: {
    type: String,
    enum: ['user', 'admin'],
    default: 'user'
  }
}, {
  timestamps: true, // adds createdAt and updatedAt automatically
  toJSON: { virtuals: true },
  toObject: { virtuals: true }
});

// Virtual for user's full name
// (assumes firstName and lastName fields are added to the schema)
userSchema.virtual('fullName').get(function() {
  return `${this.firstName} ${this.lastName}`;
});

// Pre-save hook to hash password
userSchema.pre('save', async function(next) {
  if (!this.isModified('password')) {
    return next();
  }

  try {
    const salt = await bcrypt.genSalt(10);
    this.password = await bcrypt.hash(this.password, salt);
    next();
  } catch (error) {
    next(error);
  }
});

// Method to check password
userSchema.methods.comparePassword = async function(candidatePassword) {
  return await bcrypt.compare(candidatePassword, this.password);
};

const User = mongoose.model('User', userSchema);

module.exports = User;
```
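One subtlety worth showing: because the schema declares `select: false` on `password`, a login flow has to opt back in with `.select('+password')` before `comparePassword` can see the hash. A brief sketch (the `verifyCredentials` helper is hypothetical, and the `User` model is passed in as a parameter for clarity):

```javascript
// Sketch: credential check against the Mongoose User model above.
// Returns the user on success, null on unknown email or wrong password.
async function verifyCredentials(User, email, candidatePassword) {
  // '+password' re-includes the field excluded by `select: false`
  const user = await User.findOne({ email }).select('+password');
  if (!user) {
    return null;
  }
  const ok = await user.comparePassword(candidatePassword);
  return ok ? user : null;
}

module.exports = verifyCredentials;
```

Without the `.select('+password')` call, `user.password` would be `undefined` and `bcrypt.compare` would reject every login, which is a common source of confusion with this pattern.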
```javascript
// db/models/postgresql/user.model.js
const { DataTypes } = require('sequelize');
const bcrypt = require('bcrypt');
// The singleton's connect() returns the shared Sequelize instance
const sequelize = require('../../connection').connect();

const User = sequelize.define('User', {
  id: {
    type: DataTypes.INTEGER,
    autoIncrement: true,
    primaryKey: true
  },
  username: {
    type: DataTypes.STRING(50),
    allowNull: false,
    unique: true,
    validate: { len: [3, 50] }
  },
  email: {
    type: DataTypes.STRING(100),
    allowNull: false,
    unique: true,
    validate: { isEmail: true }
  },
  password: {
    type: DataTypes.STRING,
    allowNull: false,
    validate: { len: [6, 100] }
  },
  role: {
    type: DataTypes.ENUM('user', 'admin'),
    defaultValue: 'user'
  }
}, {
  timestamps: true,
  hooks: {
    beforeCreate: async (user) => {
      const salt = await bcrypt.genSalt(10);
      user.password = await bcrypt.hash(user.password, salt);
    },
    beforeUpdate: async (user) => {
      if (user.changed('password')) {
        const salt = await bcrypt.genSalt(10);
        user.password = await bcrypt.hash(user.password, salt);
      }
    }
  }
});

// Instance method to check password
User.prototype.comparePassword = async function(candidatePassword) {
  return await bcrypt.compare(candidatePassword, this.password);
};

module.exports = User;
```
```prisma
// prisma/schema.prisma
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

model User {
  id        Int      @id @default(autoincrement())
  username  String   @unique @db.VarChar(50)
  email     String   @unique @db.VarChar(100)
  password  String
  role      Role     @default(USER)
  createdAt DateTime @default(now()) @map("created_at")
  updatedAt DateTime @updatedAt @map("updated_at")
  posts     Post[]

  @@map("users")
}

model Post {
  id        Int      @id @default(autoincrement())
  title     String   @db.VarChar(255)
  content   String?
  published Boolean  @default(false)
  authorId  Int      @map("author_id")
  createdAt DateTime @default(now()) @map("created_at")
  updatedAt DateTime @updatedAt @map("updated_at")
  author    User     @relation(fields: [authorId], references: [id])

  @@map("posts")
}

enum Role {
  USER
  ADMIN
}
```
4. Database Migrations
Database migrations track changes to your database schema over time, making it easier to collaborate and deploy changes.
Sequelize Migrations
```javascript
// migrations/20231025120000-create-users.js
'use strict';

module.exports = {
  up: async (queryInterface, Sequelize) => {
    await queryInterface.createTable('users', {
      id: {
        allowNull: false,
        autoIncrement: true,
        primaryKey: true,
        type: Sequelize.INTEGER
      },
      username: {
        type: Sequelize.STRING(50),
        allowNull: false,
        unique: true
      },
      email: {
        type: Sequelize.STRING(100),
        allowNull: false,
        unique: true
      },
      password: {
        type: Sequelize.STRING,
        allowNull: false
      },
      role: {
        type: Sequelize.ENUM('user', 'admin'),
        defaultValue: 'user'
      },
      created_at: {
        allowNull: false,
        type: Sequelize.DATE,
        defaultValue: Sequelize.literal('CURRENT_TIMESTAMP')
      },
      updated_at: {
        allowNull: false,
        type: Sequelize.DATE,
        defaultValue: Sequelize.literal('CURRENT_TIMESTAMP')
      }
    });
  },

  down: async (queryInterface, Sequelize) => {
    await queryInterface.dropTable('users');
  }
};
```
Running Sequelize Migrations
```bash
npx sequelize-cli db:migrate           # Run pending migrations
npx sequelize-cli db:migrate:undo      # Revert the most recent migration
npx sequelize-cli db:migrate:undo:all  # Revert all migrations
```
Prisma Migrations
With Prisma, you define your schema in the `schema.prisma` file and use Prisma Migrate to create and manage migrations:

```bash
# Create a migration without applying it
npx prisma migrate dev --create-only --name add_user_profile

# Apply migrations to development database
npx prisma migrate dev

# Apply migrations to production database
npx prisma migrate deploy
```
Handling Schema Changes with Prisma
1. Update your `schema.prisma` file
2. Run `npx prisma migrate dev --name descriptive_name`
3. Prisma will generate SQL migration files in `prisma/migrations/`
4. Review the generated SQL before applying to production
Migration Best Practices
- Make migrations atomic: Each migration should focus on a single change
- Use descriptive names: Name migrations clearly to understand their purpose
- Version control: Commit migration files to your repository
- Test migrations: Verify both `up` and `down` migrations work correctly
- Document breaking changes: Note any migrations that require application changes
- Backup before production migrations: Always create backups before applying migrations to production
- Validate in staging: Test migrations in a staging environment that mirrors production
5. Database Error Handling and Debugging
Comprehensive Error Handling Strategy
Effective database error handling combines multiple approaches:
```javascript
// middleware/error-handler.js
const { StatusCodes } = require('http-status-codes');
const { DatabaseError, UniqueConstraintError, ValidationError } = require('sequelize');

// Combine error logging and handling in one middleware
function databaseErrorHandler(err, req, res, next) {
  // UniqueConstraintError extends ValidationError, so check it first.
  // Neither is a subclass of DatabaseError, so they need their own checks.
  if (err instanceof UniqueConstraintError) {
    return res.status(StatusCodes.CONFLICT).json({
      status: 'error',
      message: 'A record with this information already exists',
      details: err.errors.map(e => e.message)
    });
  }

  if (err instanceof ValidationError) {
    return res.status(StatusCodes.BAD_REQUEST).json({
      status: 'error',
      message: 'Validation failed',
      details: err.errors.map(e => e.message)
    });
  }

  if (err instanceof DatabaseError) {
    console.error('Sequelize Error:', {
      message: err.message,
      sql: err.sql,
      parameters: err.parameters
    });
  }

  // Default error response
  const statusCode = err.statusCode || StatusCodes.INTERNAL_SERVER_ERROR;
  res.status(statusCode).json({
    status: 'error',
    message: err.message || 'Internal server error',
    ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
  });
}

module.exports = databaseErrorHandler;
```
```javascript
// middleware/error-handler.js
const { StatusCodes } = require('http-status-codes');
const mongoose = require('mongoose');

// Combine error logging and handling in one middleware
function databaseErrorHandler(err, req, res, next) {
  // Handle Mongoose errors
  if (err instanceof mongoose.Error) {
    console.error('Mongoose Error:', {
      name: err.name,
      message: err.message,
      path: err.path,
      value: err.value
    });

    if (err instanceof mongoose.Error.ValidationError) {
      return res.status(StatusCodes.BAD_REQUEST).json({
        statusCode: StatusCodes.BAD_REQUEST,
        message: 'Validation error',
        path: req.originalUrl,
        timestamp: new Date().toISOString(),
        details: Object.values(err.errors).map(e => e.message)
      });
    }

    if (err instanceof mongoose.Error.CastError) {
      return res.status(StatusCodes.BAD_REQUEST).json({
        statusCode: StatusCodes.BAD_REQUEST,
        message: 'Invalid ID format',
        path: req.originalUrl,
        timestamp: new Date().toISOString(),
        details: `Cannot cast ${err.value} to ${err.kind}`
      });
    }
  }

  // Duplicate key errors are raised by the MongoDB driver, not Mongoose,
  // so they must be checked outside the instanceof block above
  if (err.code === 11000) {
    return res.status(StatusCodes.CONFLICT).json({
      statusCode: StatusCodes.CONFLICT,
      message: 'Duplicate value error',
      path: req.originalUrl,
      timestamp: new Date().toISOString(),
      details: `A record with this ${Object.keys(err.keyValue)} already exists`
    });
  }

  // Default error response
  const statusCode = err.statusCode || StatusCodes.INTERNAL_SERVER_ERROR;
  res.status(statusCode).json({
    statusCode: statusCode,
    message: err.message || 'Internal server error',
    path: req.originalUrl,
    timestamp: new Date().toISOString(),
    ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
  });
}

module.exports = databaseErrorHandler;
```
```javascript
// middleware/error-handler.js
const { StatusCodes } = require('http-status-codes');
// Prisma's error classes are exported from the main package
const { Prisma } = require('@prisma/client');

// Combine error logging and handling in one middleware
function databaseErrorHandler(err, req, res, next) {
  // Handle Prisma errors
  if (err instanceof Prisma.PrismaClientKnownRequestError) {
    console.error('Prisma Error:', {
      code: err.code,
      message: err.message,
      meta: err.meta
    });

    // Handle common Prisma error codes
    switch (err.code) {
      case 'P2002': // Unique constraint violation
        return res.status(StatusCodes.CONFLICT).json({
          statusCode: StatusCodes.CONFLICT,
          message: 'A record with this information already exists',
          path: req.originalUrl,
          timestamp: new Date().toISOString(),
          fields: err.meta?.target
        });
      case 'P2025': // Record not found
        return res.status(StatusCodes.NOT_FOUND).json({
          statusCode: StatusCodes.NOT_FOUND,
          message: 'Record not found',
          path: req.originalUrl,
          timestamp: new Date().toISOString()
        });
      default:
        // Fall through to the default error response
        break;
    }
  }

  // Default error response
  const statusCode = err.statusCode || StatusCodes.INTERNAL_SERVER_ERROR;
  res.status(statusCode).json({
    statusCode: statusCode,
    message: err.message || 'Internal server error',
    path: req.originalUrl,
    timestamp: new Date().toISOString(),
    ...(process.env.NODE_ENV === 'development' && { stack: err.stack })
  });
}

module.exports = databaseErrorHandler;
```
Logging Database Activity
```javascript
// MongoDB query logging
if (process.env.NODE_ENV === 'development') {
  // mongoose.set('debug', ...) accepts true, { color: true }, or a
  // custom function; a function allows filtering out noisy queries
  mongoose.set('debug', (collectionName, methodName, ...args) => {
    // Exclude internal queries such as health checks
    if (collectionName !== 'system.version') {
      console.log(`${collectionName}.${methodName}`, ...args);
    }
  });
}
```
```javascript
// Prisma query logging
const prisma = new PrismaClient({
  log: process.env.NODE_ENV === 'development'
    ? [
        { emit: 'event', level: 'query' },
        { emit: 'stdout', level: 'error' },
        { emit: 'stdout', level: 'info' },
        { emit: 'stdout', level: 'warn' },
      ]
    : ['error', 'warn'],
});

// Log slow queries in development
if (process.env.NODE_ENV === 'development') {
  prisma.$on('query', (e) => {
    if (e.duration > 100) { // Log queries taking longer than 100ms
      console.warn(`Slow query (${e.duration}ms): ${e.query}`);
    }
  });
}
```
Error Handling Best Practices
- Categorize errors: Differentiate between operational errors (e.g., validation failures) and programming errors
- Use domain-specific error classes: Create custom error types for your application domains
- Log detailed context: Include relevant data for debugging without exposing sensitive information
- Implement retry mechanisms: Use exponential backoff for transient failures
- Implement monitoring: Set up alerts for database errors, slow queries, and connection issues
- Create error response standards: Maintain consistent error response formats across your API
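The retry recommendation above can be sketched as a small wrapper with exponential backoff. The `isTransient` predicate below is a placeholder; real code should test for the driver's actual transient error codes (connection resets, deadlocks, failovers), and a production version would usually add jitter:

```javascript
// Sketch: retry a database operation with exponential backoff.
// Delays double each attempt: baseDelayMs, 2x, 4x, ...
async function withRetry(operation, { retries = 3, baseDelayMs = 100 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await operation();
    } catch (error) {
      // Give up on permanent errors or when attempts are exhausted
      if (attempt >= retries || !isTransient(error)) {
        throw error;
      }
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Placeholder predicate: replace with driver-specific checks
function isTransient(error) {
  return error.transient === true;
}

module.exports = { withRetry };
```

Usage would look like `withRetry(() => prisma.user.findMany())`; validation failures should never be retried, which is why categorizing errors (the first practice above) matters.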
Conclusion
Database Selection Decision Tree
Section titled “Database Selection Decision Tree”-
Do you need a relational model?
- Yes: Consider PostgreSQL, especially if you need complex queries, joins, or transactions
- No: Consider MongoDB for document-oriented data or when schema flexibility is important
-
What’s your application’s read/write pattern?
- Heavy reads: PostgreSQL with proper indexing
- Heavy writes: MongoDB or PostgreSQL with write optimization
- Balanced: Either option works well
-
Do you need advanced data integrity?
- Yes: PostgreSQL with its constraints and transactions
- No: MongoDB might offer more flexibility
-
Is rapid development a priority?
- Yes: MongoDB might allow faster iterations with its schema flexibility
- No: PostgreSQL’s structure enforces better data discipline
Remember that the right database choice depends on your specific application requirements, team expertise, and long-term maintenance considerations.