Medium Common Node.js Interview Questions
1. How do you mitigate 'Callback Hell'?
The most effective solution is using Promises and async/await syntax to flatten the code structure. Additionally, modularizing code by breaking complex anonymous callbacks into named functions helps improve readability and testability.
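A minimal sketch of the before/after, using the core fs module so the snippet is self-contained; the filenames are placeholders:

```js
const fs = require('fs');

// Nested callbacks — the "pyramid of doom":
fs.readFile('a.txt', 'utf8', (err, a) => {
  if (err) return console.error(err);
  fs.readFile('b.txt', 'utf8', (err, b) => {
    if (err) return console.error(err);
    fs.writeFile('out.txt', a + b, (err) => {
      if (err) return console.error(err);
    });
  });
});

// The same flow flattened with Promises and async/await:
async function concatFiles() {
  const a = await fs.promises.readFile('a.txt', 'utf8');
  const b = await fs.promises.readFile('b.txt', 'utf8');
  await fs.promises.writeFile('out.txt', a + b);
}
concatFiles().catch(console.error);
```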
2. Explain Event-Driven Programming in the context of Node.js.
It's a paradigm where the flow of the program is determined by events (user actions, I/O completion). Node.js utilizes the EventEmitter to decouple components. One part of the app emits a signal ('user_registered'), and independent listeners react to it (send email, update stats). This promotes loose coupling.
3. Detail the phases of the Event Loop.
The Event Loop isn't a single cycle. It has phases: 1. Timers (setTimeout), 2. Pending Callbacks (system errors), 3. Idle/Prepare (internal), 4. Poll (retrieving new I/O events - the most critical phase), 5. Check (setImmediate), and 6. Close Callbacks. Understanding this order is vital for predicting execution timing.
4. How does the Event Loop prioritize different asynchronous tasks?
Microtasks (Promises, process.nextTick) have the highest priority and are executed immediately after the current operation, before the Event Loop continues to the next phase. Macrotasks (Timers, I/O callbacks) are processed in their respective phases. Misusing microtasks can starve the event loop.
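A small illustration of the priorities described above; the exact ordering of the two macrotasks in a main module can vary, so the comment only states what is guaranteed:

```js
setTimeout(() => console.log('timeout (macrotask, Timers phase)'), 0);
setImmediate(() => console.log('immediate (macrotask, Check phase)'));
Promise.resolve().then(() => console.log('promise (microtask)'));
process.nextTick(() => console.log('nextTick (highest-priority microtask queue)'));
console.log('synchronous');
// Guaranteed: "synchronous" first, then "nextTick", then "promise",
// and only afterwards the two macrotasks.
```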
5. What is Libuv?
Libuv is a multi-platform support library written in C. It abstracts the OS differences and provides the Event Loop implementation and the Thread Pool. It enables Node.js to perform non-blocking I/O operations uniformly across Windows, Linux, and macOS.
6. What is the difference between fork() and spawn()?
spawn() creates a new process and streams data (stdout/stderr), useful for running system commands. fork() is a specialized version of spawn for creating Node.js processes. Crucially, fork() establishes an IPC (Inter-Process Communication) channel, allowing the parent and child to exchange messages via .send().
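A minimal sketch of the fork() IPC channel, written as two hypothetical files:

```js
// parent.js
const { fork } = require('child_process');

const child = fork('./child.js');               // spawns a new Node.js process
child.on('message', (msg) => console.log('from child:', msg));
child.send({ task: 'double', payload: 21 });    // messages flow over the IPC channel

// child.js
process.on('message', ({ task, payload }) => {
  if (task === 'double') process.send({ result: payload * 2 });
});
```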
7. When should you use Worker Threads?
Worker Threads are designed for CPU-intensive tasks (e.g., image resizing, heavy parsing). Unlike fork, which creates a new process with its own memory overhead, Worker Threads share the same process memory (via SharedArrayBuffer), making them more lightweight and efficient for parallel data processing.
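A minimal sketch offloading a CPU-bound calculation to a worker thread; the two files and the naive Fibonacci are illustrative only:

```js
// main.js
const { Worker } = require('worker_threads');

function runFib(n) {
  return new Promise((resolve, reject) => {
    const worker = new Worker('./fib-worker.js', { workerData: n });
    worker.once('message', resolve);   // result posted back from the worker
    worker.once('error', reject);
  });
}
runFib(40).then((result) => console.log('fib(40) =', result));

// fib-worker.js
const { parentPort, workerData } = require('worker_threads');
const fib = (n) => (n < 2 ? n : fib(n - 1) + fib(n - 2)); // deliberately CPU-heavy
parentPort.postMessage(fib(workerData));
```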
8. How does Garbage Collection work in V8?
V8 uses a generational approach. New objects go to the 'New Space' (Scavenge algorithm, fast). Long-lived objects move to 'Old Space' (Mark-Sweep-Compact algorithm, slower). As a developer, you need to ensure references (timers, closures, listeners) are cleared so objects can be collected; otherwise, you cause memory leaks.
9. What are common security risks in Node.js and mitigations?
Common risks include Injection (SQL/NoSQL), XSS, and Dependency Vulnerabilities. Mitigations: Validate/sanitize all inputs (using Joi/Zod), use parameterized queries, implement security headers (Helmet), and regularly run npm audit to patch vulnerable dependencies.
10. How do you handle authentication strategies?
We typically use Passport.js or custom middleware. For REST APIs, JWT (stateless) is preferred for scalability. For MVC apps, session-based auth (stateful, stored in Redis) is common. Passwords should always be hashed using bcrypt or argon2, never stored in plain text.
11. What is CORS and how is it handled?
Cross-Origin Resource Sharing is a browser security mechanism restricting cross-domain requests. In Node.js, we handle this via middleware (like the cors package) which sets the Access-Control-Allow-Origin headers. In production, we should be specific about allowed origins rather than using a wildcard *.
12. How do you connect to a database?
We use drivers (e.g., pg, mongodb) or ORMs (Prisma, Mongoose). The critical part for a senior developer is managing the Connection Pool. We don't open/close connections per request; we reuse a pool of open connections to maintain performance.
13. How do you enforce code style consistency?
We use Linters (ESLint) for code quality and Formatters (Prettier) for style. These are enforced via CI pipelines and pre-commit hooks (Husky) to ensure no unstyled code reaches the repository.
14. Why use Express.js over native HTTP?
Native HTTP is low-level and requires significant boilerplate for parsing bodies, handling routes, and managing headers. Express provides a robust framework for Routing and Middleware management, significantly speeding up development time and reducing errors.
15. How do you handle errors centrally in Express?
By defining a custom error-handling middleware with four arguments: (err, req, res, next). Defined last in the middleware chain, this catches all errors passed to next(err), allowing for centralized logging and consistent error response formatting.
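A minimal Express sketch showing the four-argument handler mounted last:

```js
const express = require('express');
const app = express();

app.get('/boom', (req, res, next) => {
  // Simulate a failure; in async handlers you would call next(err) from a catch block
  next(new Error('Something broke'));
});

// Defined last, with four arguments: catches everything passed to next(err)
app.use((err, req, res, next) => {
  console.error(err);
  res.status(err.status || 500).json({ error: err.message });
});

app.listen(3000);
```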
16. What is the purpose of the Express Router?
The Router creates modular, mountable route handlers. It allows us to split a monolithic app.js file into smaller, domain-specific files (e.g., userRoutes, productRoutes), making the codebase maintainable and easier to test.
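A small sketch of a mountable router, split across two hypothetical files:

```js
// routes/userRoutes.js
const { Router } = require('express');
const router = Router();

router.get('/', (req, res) => res.json([]));                        // GET /users
router.get('/:id', (req, res) => res.json({ id: req.params.id }));  // GET /users/:id

module.exports = router;

// app.js
const express = require('express');
const userRoutes = require('./routes/userRoutes');

const app = express();
app.use('/users', userRoutes); // mount the modular router under a path prefix
app.listen(3000);
```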
17. How do you process CLI arguments?
Natively via process.argv. However, for complex tools, parsing the array manually is error-prone. We use libraries like Commander or Yargs to handle flags, defaults, and help documentation.
18. Significance of process.env.NODE_ENV?
It is the standard flag for environment detection. Setting it to 'production' drastically changes how libraries behave: Express caches view templates and CSS, logging becomes less verbose, and development-only stack traces are hidden.
19. How do you manage application secrets?
Secrets (API keys, DB passwords) must never be in version control. We inject them as environment variables at runtime. In production, we use secret management services like AWS Secrets Manager or HashiCorp Vault to securely inject these values.
20. How does Clustering work?
Since Node is single-threaded, a single instance runs on one CPU core. The cluster module forks the main process into worker processes (usually one per core), all sharing the same server port. This allows the app to utilize the full CPU capacity. Process managers like PM2 automate this.
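A minimal clustering sketch with the built-in cluster module (PM2 automates the same idea):

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {                        // `isMaster` on older Node versions
  for (let i = 0; i < os.cpus().length; i++) cluster.fork();
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} died, forking a replacement`);
    cluster.fork();
  });
} else {
  // All workers share the same port; the primary distributes connections
  http.createServer((req, res) => res.end(`handled by ${process.pid}`)).listen(3000);
}
```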
21. What frameworks do you use for testing?
Jest is currently the industry standard due to its 'zero-config' setup and built-in mocking/assertions. Mocha and Chai are also popular for their flexibility. We write Unit tests for logic, Integration tests for API endpoints, and use libraries like supertest for HTTP assertions.
22. Explain the Test Pyramid.
It's a strategy dictating the distribution of tests. Base: Many Unit Tests (fast, isolated). Middle: Fewer Integration Tests (check interaction between modules/DB). Top: Very few End-to-End (E2E) tests (slow, simulate user behavior). This ensures a fast feedback loop.
23. GraphQL vs REST: When to use which?
REST is standard, caching-friendly, and simple. GraphQL is powerful for complex relational data where clients need to fetch specific fields to avoid over-fetching (getting too much data) or under-fetching (needing multiple requests). GraphQL shifts complexity to the server.
24. How do you implement file uploads?
We use middleware like multer to handle multipart/form-data. Ideally, we don't store files on the application server disk; we stream them directly to cloud storage (S3, GCS) to ensure the server remains stateless and scalable.
25. When are WebSockets preferred over HTTP?
WebSockets allow full-duplex, persistent connections. They are necessary for real-time applications (Chat, Live Feeds, Gaming) where the server needs to push data to the client without the client requesting it, avoiding the overhead of HTTP polling.
26. How do you manage multiple async tasks concurrently?
Using Promise.all() to run tasks in parallel and wait for all to finish, or Promise.allSettled() if we want to proceed even if some fail. Sequential await inside a loop should be avoided unless dependencies exist between iterations.
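A short sketch contrasting the two; fetchUser and fetchOrders are hypothetical async helpers:

```js
async function loadDashboard(id) {
  // Fail fast: rejects as soon as any task rejects
  const [user, orders] = await Promise.all([fetchUser(id), fetchOrders(id)]);

  // Tolerant: always resolves, then inspect each outcome individually
  const results = await Promise.allSettled([fetchUser(id), fetchOrders(id)]);
  for (const r of results) {
    if (r.status === 'fulfilled') console.log(r.value);
    else console.error(r.reason);
  }
  return { user, orders };
}
```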
27. How does Node communicate with external APIs?
Using HTTP clients like axios or the native fetch API (available in newer Node versions). Key considerations include handling timeouts, retries with exponential backoff, and respecting rate limits of the external service.
28. What is the crypto module used for?
It provides cryptographic functionality like hashing (SHA256) and encryption (AES). While good for data encryption, for password hashing specifically, we prefer specialized libraries like bcrypt or scrypt which are computationally expensive by design to resist brute-force attacks.
29. Role of the Thread Pool in Node.js architecture.
The Thread Pool handles blocking I/O operations (File System, DNS, compression) and CPU-bound crypto tasks. This allows the main Event Loop to remain non-blocking. The default pool size is 4, but can be tuned via UV_THREADPOOL_SIZE.
30. How do Promises and Async/Await manage control flow?
They flatten the control flow, making it read top-to-bottom. Promises allow for chaining (.then()), while Async/Await pauses the execution of the function (non-blocking) until the Promise resolves, creating code that is easier to reason about.
31. How can you manage packages in a Node.js project?
Packages are managed via a package.json file. We use package managers like NPM, Yarn, or PNPM to install libraries. These tools handle dependency resolution (flattening the tree), semantic versioning, and script execution. Best practice involves committing the lockfile (package-lock.json or yarn.lock) to ensure the exact same versions are installed in CI/CD.
32. What is REPL in the context of Node.js?
REPL stands for Read-Eval-Print Loop. It is the interactive shell invoked by typing node in the terminal. It reads JS code, evaluates it line-by-line, prints the result, and loops back. It is incredibly useful for testing snippets, debugging logic, or checking environment capabilities without creating a file.
33. What is event-driven programming?
It is a programming paradigm where the flow of execution is determined by events (e.g., user interaction, incoming network request, file read complete). Node.js implements this via the EventEmitter class. Objects emit named events, and listeners (callbacks) subscribed to those events react to them asynchronously.
34. What are the common challenges faced in Node.js development?
- Blocking the Event Loop: Accidental CPU-heavy tasks can freeze the entire server.
- Callback Hell: Poorly structured async code (mitigated by Promises).
- Error Handling: Uncaught exceptions can crash the process; robust error management is required.
- Memory Leaks: References in closures or global variables preventing Garbage Collection.
35. What are the key features of Node.js?
Key features include: Asynchronous/Non-blocking I/O (handling high concurrency), Single-threaded Event Loop (low resource overhead), Cross-Platform (runs on Windows/Linux/Mac), V8 Engine (fast execution), and a robust Module System (CommonJS/ESM) supported by NPM.
36. What is 'Callback Hell' and how do you avoid it?
Callback Hell refers to heavily nested callbacks that make code difficult to read and maintain (Pyramid of Doom). We avoid it by:
- Using Promises and Async/Await to flatten control flow.
- Modularizing code into small, reusable functions.
- Using control flow libraries (like async) if managing legacy callback-based APIs.
37. What is the meaning of the Test Pyramid?
The Test Pyramid is a framework for testing strategy. It advocates for a large base of Unit Tests (fast, isolated), a medium layer of Integration Tests (testing module interactions), and a small peak of End-to-End (E2E) Tests (slow, user simulation). This structure ensures confidence in the code while maintaining a fast feedback loop.
38. What do you understand about middleware in Node.js (specifically Express)?
Middleware functions are functions that have access to the request object (req), the response object (res), and the next function in the application’s request-response cycle. They can execute code, modify the request/response objects, end the request-response cycle, or call the next middleware in the stack (e.g., for logging, auth, or parsing JSON).
39. How many types of streams are available in Node.js?
There are four fundamental stream types:
- Readable (sources of data, e.g., fs.createReadStream).
- Writable (destinations for data, e.g., fs.createWriteStream).
- Duplex (both readable and writable, e.g., TCP sockets).
- Transform (duplex streams that modify data as it is written and read, e.g., zlib compression).
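A minimal example combining three of these types: a Readable piped through a Transform (gzip) into a Writable, with backpressure and error handling managed by stream.pipeline. The filenames are placeholders:

```js
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('access.log'),    // Readable source
  zlib.createGzip(),                    // Transform: compresses chunks as they pass
  fs.createWriteStream('access.log.gz'),// Writable destination
  (err) => {
    if (err) console.error('Pipeline failed:', err);
    else console.log('Pipeline succeeded');
  }
);
```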
40. What is the Reactor Pattern in Node.js?
The Reactor Pattern is the architectural pattern Node.js uses to handle non-blocking I/O. An Event Demultiplexer listens for I/O requests. When a request comes in, it's delegated to the hardware. When the operation completes, the demultiplexer pushes the associated callback handler into the Event Queue, which the Event Loop then executes.
41. What are LTS releases of Node.js?
LTS stands for Long-Term Support. Even-numbered versions (e.g., 18, 20) enter LTS status and are supported for 30 months with critical bug and security fixes. For enterprise/production applications, you should always use the Active LTS version to ensure stability and security.
42. How does Node.js work under the hood?
Node.js is a runtime environment built on Chrome's V8 JavaScript engine. It uses an event-driven, non-blocking I/O model. This means it operates on a single main thread using an Event Loop. When an asynchronous operation (like a DB query or file read) is initiated, Node.js offloads it to the system kernel or a worker thread (via libuv). The main thread continues executing subsequent code. Once the operation completes, a callback is pushed to the callback queue, which the Event Loop pushes back onto the main stack when it's empty.
43. Why is Node.js designed to be single-threaded?
Node.js was designed to be single-threaded to avoid the complexity and overhead of thread context switching and thread safety issues (like race conditions and deadlocks) common in multi-threaded environments. By utilizing a single-threaded Event Loop for orchestration and delegating I/O tasks asynchronously, it achieves high scalability and throughput for I/O-bound applications without the heavy memory footprint of creating a new thread for every request.
44. Why is Node.js often preferred over traditional blocking technologies like Java or PHP for specific use cases?
Node.js is preferred for I/O-heavy, real-time, and data-streaming applications because of its non-blocking architecture. In traditional models (like older PHP or multi-threaded Java servers), each request spawns a thread; if that thread waits for I/O, CPU cycles and RAM are wasted. Node.js handles these with a single thread, making it highly efficient for high-concurrency scenarios like chat apps or streaming services. Additionally, using JavaScript on both frontend and backend unifies the development stack.
45. What is the V8 engine and how does it relate to Node.js?
V8 is Google's open-source high-performance JavaScript and WebAssembly engine, written in C++. It is the core component that powers Node.js. V8 compiles JavaScript code directly into native machine code (JIT compilation) rather than interpreting it, which results in extremely fast execution. Node.js adds bindings to V8 to allow JavaScript to interact with the operating system (file system, network, etc.), which browsers typically restrict.
46. Can you explain Control Flow in the context of asynchronous Node.js?
Control flow in Node.js refers to the order in which functions call and execute, particularly when dealing with asynchronous operations. Because async tasks finish at unpredictable times, managing the execution order is critical. We manage this using patterns like Callbacks (historically), Promises, and modern Async/Await syntax. Control flow libraries (like 'async') were also popular before Promises became standard to handle executing tasks in series, parallel, or waterfall sequences.
47. What are the main architectural disadvantages of Node.js?
The primary disadvantage is its unsuitability for CPU-intensive tasks (like video encoding or complex math). Since it uses a single thread, a long-running calculation blocks the event loop, halting all incoming requests. Another issue is 'Callback Hell' (though mitigatable). Additionally, the instability of the API in early versions (less of an issue now) and the sheer volume of low-quality packages in NPM can be management challenges.
48. Explain Promises in Node.js and how they improve upon callbacks.
A Promise is an object representing the eventual completion (or failure) of an asynchronous operation and its resulting value. It has three states: pending, fulfilled, or rejected. Promises improve upon callbacks by providing a cleaner syntax (chaining .then() and .catch()) and avoiding 'Callback Hell' (deeply nested callbacks). They are the foundation for the modern async/await syntax, which makes asynchronous code look synchronous.
49. What is Event-Driven Programming and how is it implemented in Node.js?
Event-driven programming is a paradigm where the flow of the program is determined by events (like user actions, sensor outputs, or messages from other programs). Node.js implements this via the EventEmitter class. Objects can emit named events, and listeners can subscribe to those events with callback functions. This decouples the code; the emitter doesn't need to know who is listening, just that an event occurred.
50. What is a Buffer in Node.js and why is it necessary?
A Buffer is a global class used to handle binary data directly. JavaScript was traditionally text-based (UTF-8 strings). Buffers provide a way to interact with streams of binary data (like TCP streams, file system operations, or image processing) by allocating a fixed amount of memory outside the V8 heap. They store raw data as a sequence of integers (bytes).
51. Explain the functionality of the Crypto module.
The crypto module provides cryptographic functionality that includes a set of wrappers for OpenSSL's hash, HMAC, cipher, decipher, sign, and verify functions. It is used to secure passwords (hashing), generate tokens, encrypt/decrypt data, and handle digital signatures. It is essential for security implementation in Node.js applications.
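A small sketch of two common uses, hashing and HMAC signing; the secret here is a placeholder:

```js
const crypto = require('crypto');

// One-way hash (integrity checks, cache keys) — not sufficient on its own for passwords
const digest = crypto.createHash('sha256').update('hello world').digest('hex');

// HMAC: a hash keyed with a secret, e.g. for signing or verifying webhook payloads
const signature = crypto
  .createHmac('sha256', process.env.WEBHOOK_SECRET || 'dev-only-secret')
  .update('payload-body')
  .digest('hex');

console.log(digest, signature);
```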
52. Explain the use of the Timers module in Node.js.
The timers module exposes global functions to schedule code execution. setTimeout executes code after a delay. setInterval executes code repeatedly with a delay. setImmediate executes code at the end of the current event loop cycle (Check phase). process.nextTick (though strictly part of process, not timers) executes immediately after the current operation completes.
53. Explain the role of the Passport module.
Passport is a flexible authentication middleware for Node.js. It modularizes authentication strategies. Instead of writing custom logic for Facebook login, Google login, or JWT handling, you simply plug in the relevant 'Strategy'. It standardizes how user sessions and credentials are handled in Express-based applications.
54. What is the 'fork' method in Node.js?
The fork method is part of the child_process module. It is used to spawn a new Node.js process (invoking a V8 instance). It enables inter-process communication (IPC) between the master and the worker process. This is the underlying mechanism used by the Cluster module to scale Node.js applications across multiple CPU cores.
55. What are the three primary methods to avoid Callback Hell?
- Modularization: Breaking callbacks into independent, named functions.
- Promises: Using .then() chains to handle sequences linearly.
- Async/Await: Using syntax sugar over Promises to write asynchronous code that reads like synchronous code, which is the modern standard.
56. What is CORS and how is it handled in Node.js?
CORS (Cross-Origin Resource Sharing) is a browser security feature that restricts cross-origin HTTP requests. If your Node API is on port 3000 and your frontend is on port 8080, the browser blocks the request by default. In Node/Express, we handle this using the cors middleware, which sets the appropriate headers (like Access-Control-Allow-Origin) to tell the browser to allow the request.
57. Explain the TLS module in Node.js.
The tls (Transport Layer Security) module provides an implementation of TLS and SSL protocols that sit on top of TCP. It allows Node.js to establish secure, encrypted connections. It is used heavily in the https module to serve secure web traffic.
58. What is the purpose of the NODE_ENV variable?
NODE_ENV is an environment convention used to specify the application's running mode (usually 'development', 'production', or 'test'). Many libraries check this variable to optimize behavior. For example, setting it to 'production' in Express disables stack trace dumps on errors and enables view caching, improving performance and security.
59. What is the Test Pyramid and how does it apply to Node.js?
The Test Pyramid is a strategy for software testing. The base is Unit Tests (fast, isolated, cover individual functions - e.g., using Jest). The middle is Integration Tests (verify modules work together, e.g., API endpoint to DB). The top is End-to-End (E2E) Tests (slow, test full user flow, e.g., using Cypress/Playwright). In Node, we aim for many unit tests, fewer integration tests, and very few E2E tests.
60. What is Piping in Node.js Streams?
Piping is a mechanism where the output of one stream is connected directly to the input of another stream using the .pipe() method. It manages the flow of data automatically so that the destination is not overwhelmed by the source (backpressure). For example: readStream.pipe(gzip).pipe(writeStream) reads a file, compresses it, and writes it to disk efficiently.
61. How do you manage user sessions in a stateless Node.js environment?
Since HTTP is stateless, we manage sessions using either Server-side Sessions (storing a session ID in a cookie and the data in Redis/DB via express-session) or Client-side Tokens (JWTs). JWTs are more scalable for microservices as they don't require a central lookup; the token contains the data and is signed by the server.
62. How can we implement Authentication vs Authorization in Node.js?
Authentication (Who are you?) is handled via libraries like Passport or manual bcrypt comparison + JWT generation. Authorization (What are you allowed to do?) is handled via middleware that decodes the JWT/session, checks the user's role (e.g., 'admin'), and calls next() if allowed or returns 403 Forbidden if not.
63. Difference between Node.js and Python for backend development?
Node.js uses an event-driven, non-blocking I/O model, making it superior for real-time apps (chat, streaming) and high concurrency. Python typically uses a synchronous, blocking model (asyncio notwithstanding), which can be slower for high-concurrency I/O, but it excels in CPU-heavy work, data science, and AI integration thanks to its native-library ecosystem. Node.js offers the benefit of using the same language (JS) on front and back ends.
64. Explain the use of Redis with Node.js.
Redis is an in-memory data structure store used as a database, cache, and message broker. In Node.js, it is primarily used for Caching (storing expensive DB query results for fast retrieval), Session Storage (sharing sessions across clustered server instances), and Pub/Sub (real-time messaging queues).
65. What are WebSockets and how do they differ from HTTP?
HTTP is uni-directional (Client requests -> Server responds -> Connection closes). WebSockets (WS) provide a full-duplex, persistent communication channel over a single TCP connection. The server can push data to the client without a request. This is essential for real-time apps like chat, live sports updates, or gaming.
66. Explain the Util module in Node.js.
The util module provides utility functions primarily for internal Node.js use and debugging. Common uses include util.promisify() (converts callback-based functions to Promises), util.format() (string formatting), and util.types (checking data types reliably).
67. Explain the DNS module in Node.js.
The dns module enables name resolution (looking up IP addresses from domain names) and reverse lookups. It has two categories:
- Using the underlying OS facilities (like dns.lookup), and
- Connecting to an actual DNS server to perform name resolution (like dns.resolve).
68. What is an EventEmitter and how is it customized?
The EventEmitter class (from the 'events' module) is the core of Node's event-driven architecture. You create a class that extends EventEmitter to make your own objects emit events. You use .emit('eventName', data) to trigger an event and .on('eventName', callback) to listen for it. It creates a loose coupling between the action and the reaction.
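A minimal custom emitter; the OrderService class and event name are illustrative:

```js
const { EventEmitter } = require('events');

class OrderService extends EventEmitter {
  placeOrder(order) {
    // ... persist the order ...
    this.emit('order_placed', order); // signal the fact, without knowing who listens
  }
}

const orders = new OrderService();
orders.on('order_placed', (order) => console.log('send confirmation email for', order.id));
orders.placeOrder({ id: 42 });
```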
69. When should you use NPM versus Yarn?
Historically, Yarn was created to solve NPM's performance and determinism issues. Today, both are mature and feature-parity is high (both support lockfiles, workspaces, etc.). The choice is often a team preference or based on specific features like 'Plug'n'Play' (Yarn) vs standard node_modules (NPM). The critical rule is to never mix them in the same project to avoid lockfile conflicts.
70. What is a Stub in testing? Name a use case.
A Stub is a test double that replaces a real component and provides canned answers to calls during the test. Unlike a Mock (which verifies behavior), a Stub just provides data. Use case: Stubbing an external payment gateway API to always return 'Success' so you can test your order processing logic without actually charging a credit card or hitting the network.
71. What is the Test Pyramid?
It is a framework for balancing test coverage. The base is Unit Tests (numerous, fast, cheap to maintain). The middle is Integration Tests (fewer, test interactions). The tip is End-to-End (E2E) Tests (very few, slow, expensive). Following this prevents the 'Testing Ice Cream Cone' anti-pattern where slow E2E tests dominate, slowing down deployment.
72. Which HTTP framework do you prefer and why?
For most standard applications, Express is the choice due to its massive ecosystem and community support. However, for high-performance microservices, Fastify is preferred due to its lower overhead and built-in schema validation. For large-scale enterprise architecture, NestJS is excellent as it enforces a structured, Angular-like pattern with dependency injection.
73. When are background/worker processes useful and how do you handle them?
They are essential for any task that takes longer than a few milliseconds (emailing, PDF generation, data analysis) to prevent blocking the main HTTP request. We handle them using Message Queues (like RabbitMQ or BullMQ/Redis). The main app pushes a job to the queue, and a separate worker process consumes and executes it asynchronously.
74. How can you secure your HTTP cookies against XSS attacks?
You must set the HttpOnly flag, which prevents client-side JavaScript (and thus malicious XSS scripts) from accessing the cookie. Additionally, set the Secure flag to ensure it's only sent over HTTPS, and the SameSite attribute (to Strict or Lax) to prevent Cross-Site Request Forgery (CSRF).
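A minimal Express sketch of the flags in practice; the session value is a placeholder:

```js
const express = require('express');
const app = express();

app.post('/login', (req, res) => {
  const sessionId = 'opaque-session-id'; // issued after verifying credentials
  res.cookie('sid', sessionId, {
    httpOnly: true,      // not readable by client-side JS (XSS mitigation)
    secure: true,        // only transmitted over HTTPS
    sameSite: 'strict',  // not attached to cross-site requests (CSRF mitigation)
    maxAge: 15 * 60 * 1000,
  });
  res.sendStatus(204);
});

app.listen(3000);
```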
75. How can you ensure your project dependencies are safe?
Automate the process. Use npm audit or yarn audit in your CI/CD pipeline to break builds on high-severity vulnerabilities. Integrate tools like Snyk or Dependabot/Renovate to automatically open PRs for security updates. Also, avoid using abandoned packages or those with few maintainers.
76. Puzzle: What is wrong with this Promise code? new Promise((resolve, reject) => { throw new Error('error') }).then(console.log)
The code lacks a .catch() block. While the Promise constructor implicitly catches the synchronous error and rejects the promise, there is no handler for that rejection. In Node.js, this results in an 'UnhandledPromiseRejectionWarning' and, in modern versions, will terminate the process with a non-zero exit code, crashing the application.
77. What is fork in Node.js?
fork() is a method in the child_process module used to spawn a new Node.js process. Unlike spawn, fork creates a communication channel (IPC) between the parent and child, allowing them to send messages back and forth. It is useful for offloading CPU-intensive tasks to a separate process to avoid blocking the main event loop.
78. In the async utility library, what arguments does async.queue take?
async.queue creates a queue object with a specified concurrency. It takes two arguments: a Worker Function (which processes the tasks) and a Concurrency integer (the number of tasks allowed to run in parallel).
79. Explain the concept of a Stub in Node.js testing.
A Stub is a test double used to replace a real dependency (like a database call or API request) with a pre-programmed response. It ensures tests are fast, deterministic, and isolated. For example, instead of actually calling an external weather API, you stub the function to simply return { temp: 72 } immediately.
80. Describe the Exit Codes of Node.js.
Exit codes tell the OS why the process stopped. Code 0 means success/normal termination. Code 1 is for Uncaught Fatal Exceptions. Other common ones are 5 (V8 Fatal Error) or 137 (Out of Memory/Kill Signal). You can manually set this via process.exit(code).
81. Why does Node.js use Google's V8 Engine?
V8 is chosen for its performance. It compiles JavaScript directly into native machine code using Just-In-Time (JIT) compilation rather than interpreting it. This speed, combined with V8's efficient memory management and rapid innovation cycle (driven by Chrome), makes it the ideal engine for a high-performance runtime like Node.
82. Why should you separate the Express 'App' and 'Server' definition?
Separation of concerns and testability. We define the app (routes, middleware) in one file and the server (network listening) in another. This allows us to import app into our test suite (using Supertest) without actually opening a network port, while the entry file simply imports app and calls .listen().
83. Explain the Reactor Pattern in Node.js.
The Reactor Pattern is the architectural heart of Node.js. It handles concurrent I/O operations by observing resources. When an I/O request arrives, it is submitted to the Demultiplexer. Once the operation is ready (e.g., file read complete), the Demultiplexer pushes the associated Event Handler (callback) to the Event Queue, which the Event Loop then executes.
84. What is Middleware in the context of Node.js/Express?
Middleware functions are the building blocks of an Express app. They intercept the Request-Response cycle. They can execute code, modify the request (e.g., parsing JSON body), authenticate users, or handle errors. They must call next() to pass control to the next middleware, or the request will hang.
85. What are Node.js Buffers?
Buffers are temporary memory storage for raw binary data. While JS handles Unicode strings, Buffers handle streams of binary data (like images, archives, or TCP streams). They are allocated outside the V8 heap for performance and interact directly with memory.
86. What are Node.js Streams?
Streams are objects that let you read data from a source or write data to a destination in continuous chunks. There are four types: Readable, Writable, Duplex, and Transform. They are memory efficient because they don't load the entire file into RAM at once.
87. How do we use async/await in Node.js?
We mark a function with async, which ensures it returns a Promise. Inside, we use await before a Promise to pause execution until that Promise resolves. This allows us to write asynchronous code that looks synchronous and use standard try/catch blocks for error handling.
88. How does Node.js overcome the blocking of I/O operations?
It uses the Libuv library to offload blocking I/O tasks (like file system access or DNS lookups) to a pool of background threads. The main JavaScript thread sends the task to Libuv and continues working. When the background thread finishes the task, it signals the Event Loop to execute the callback.
89. Differentiate between process.nextTick() and setImmediate().
process.nextTick() is not technically part of the event loop phases; it runs immediately after the current operation finishes, processing its queue until drained (potentially blocking I/O). setImmediate() runs in the 'Check' phase of the Event Loop cycle, making it fairer and preventing I/O starvation.
90. If Node.js is single-threaded, how does it handle concurrency?
It handles concurrency via the Event Loop and Non-blocking I/O. It doesn't run execution threads in parallel for JS code; instead, it rapidly switches between tasks. When a task waits for I/O, Node delegates that waiting to the OS/Kernel and moves to the next request. This allows a single thread to manage thousands of pending connections.
91. What is the Event Loop in Node.js?
The Event Loop is a mechanism that enables non-blocking I/O. It constantly monitors the Call Stack and the Callback Queue. If the Call Stack is empty, it dequeues events from the Callback Queue and pushes them to the Stack for execution, cycling through specific phases (Timers, Pending, Poll, Check, Close).
92. What do you understand by 'Callback Hell'?
It is an anti-pattern where multiple nested callbacks create a 'pyramid of doom' shape in the code, making it unreadable and difficult to debug. We solve this by modularizing functions or using Promises and Async/Await.
93. Differentiate between process.nextTick(), setImmediate(), and setTimeout() in the context of the Event Loop phases.
process.nextTick() is not part of the Event Loop; it fires immediately after the current operation completes but before the Event Loop continues, potentially starving I/O if misused. setTimeout() belongs to the 'Timers' phase and executes after a minimum threshold. setImmediate() belongs to the 'Check' phase and is designed to run after the 'Poll' (I/O) phase. In a real-world server, use setImmediate to queue tasks that should run after I/O callbacks, and nextTick only for urgent error handling or state clearing that must happen before anything else.
94. Explain the difference between fork() and spawn() in the child_process module. When would you use fork?
spawn() creates a new process and executes a command (like running a shell script or Python job), communicating via streams (stdin/stdout). It is efficient for generic tasks. fork() is a specialized version of spawn() specifically for Node.js modules. Crucially, fork() establishes a built-in IPC (Inter-Process Communication) channel between parent and child, allowing the exchange of messages/objects. You use fork() when you need to offload CPU-intensive Node.js tasks (like image processing or heavy math) to a separate process to avoid blocking the main Event Loop.
95. What are Streams in Node.js and why are they critical for handling large data payloads?
Streams are an abstract interface for working with streaming data. Unlike fs.readFile, which buffers an entire file into memory (risking out-of-memory crashes with large files), Streams read/write data in small chunks. This keeps the memory footprint low and constant, regardless of file size. They are critical for high-performance apps, such as video streaming servers or ETL pipelines. Piping (src.pipe(dest)) also handles 'backpressure' automatically, pausing the read stream if the write stream cannot keep up.
96. What is the purpose of NODE_ENV and how does it impact the behavior of libraries like Express?
NODE_ENV is an environment variable used to signal the execution context (e.g., 'development', 'production'). Setting NODE_ENV=production is a critical performance optimization. In Express.js, this flag disables stack trace generation for errors (security), caches view templates (performance), and generates less verbose logs. Failing to set this in production can lead to significantly slower response times and security vulnerabilities due to verbose error leakage.
97. What is the 'Buffer' class, and why is it distinct from standard JavaScript strings?
Standard JS strings are UTF-16 encoded and managed by the V8 heap. Buffers are chunks of raw memory allocated outside the V8 heap, used to handle binary data (TCP streams, file system operations, image processing). They are fixed-length and interact directly with system memory. Senior developers must be careful with Buffer.allocUnsafe(), which is faster but may contain sensitive data from previously used memory, posing a security risk if not immediately overwritten.
98. How would you secure a Node.js application? List key implementations.
Security is multi-layered. Key implementations include:
- Helmet.js: Sets secure HTTP headers (e.g., HSTS, X-Frame-Options).
- Input Validation: Using libraries like Joi/Zod to sanitize all incoming data (preventing Injection/XSS).
- Rate Limiting: Using express-rate-limit to prevent DDoS/Brute Force.
- Dependency Auditing: Running npm audit to patch vulnerable packages.
- CORS: Strictly defining allowed origins.
- Secrets Management: Never committing .env files; using Vault or AWS Secrets Manager.
99. What is the Test Pyramid, and how should it be implemented in a Node.js API?
The Test Pyramid suggests a distribution of tests: Unit Tests (Base, 70%) using Jest/Mocha to test individual functions in isolation; Integration Tests (Middle, 20%) using Supertest to verify API endpoints and database interactions; and E2E Tests (Top, 10%) simulating full user flows. In Node.js, because of the lack of compile-time type checking (unless using TS), a strong base of unit tests is vital to catch logic errors early.
100. What is the difference between readFile (buffer) and createReadStream (stream)?
fs.readFile reads the entire file content into memory (RAM) before making it available. This is fast for small configuration files but disastrous for large files (e.g., 500MB logs) as it spikes memory usage. fs.createReadStream reads the file in small chunks (default 64kb), emitting 'data' events. This allows processing gigabytes of data with constant, low memory usage and is the standard for production I/O operations.
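A brief sketch of the two approaches; the filenames are placeholders:

```js
const fs = require('fs');

// Buffers the whole file in RAM — fine for small configuration files
async function readSmallConfig() {
  return fs.promises.readFile('config.json', 'utf8');
}

// Streams chunk by chunk — memory stays constant even for multi-GB files
fs.createReadStream('big.log')
  .on('data', (chunk) => process.stdout.write(chunk))
  .on('end', () => console.log('done'));
```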
101. How does package-lock.json differ from package.json, and why is it critical for CI/CD pipelines?
package.json defines the functional dependencies and their allowed version ranges (e.g., ^1.2.0). However, this can lead to 'drift' where different developers or CI servers install slightly different versions of nested dependencies. package-lock.json locks down the exact version of every package and sub-dependency installed in the node_modules tree. In CI/CD, you should use npm ci (Clean Install) instead of npm install; npm ci reads strictly from the lockfile, ensuring that the production build is bit-for-bit identical to the development environment.
102. Explain the difference between require() and import (ES Modules) in Node.js.
require() is part of the CommonJS (CJS) system, which is synchronous and dynamic (can be called conditionally inside functions). import is part of ECMAScript Modules (ESM), which is asynchronous and static (must be at the top level, allowing for static analysis and tree-shaking). While Node.js now supports ESM natively, many legacy projects still use CJS. A key difference is that CJS modules export a copy of values, whereas ESM exports live bindings (if the exported value changes, the importer sees the change).
103. How do you securely manage authentication in a stateless Node.js API?
In a stateless architecture (REST/GraphQL), we typically use JSON Web Tokens (JWT). When a user logs in, the server signs a token containing their ID (payload) using a secret key (HMAC) or private key (RSA). This token is sent to the client. For subsequent requests, the client sends the token in the Authorization: Bearer header. The server verifies the signature without needing to check a database session store. Best practices include: short expiration times (15min), using Refresh Tokens for long-lived sessions, and never storing sensitive data in the JWT payload.
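A minimal sketch assuming the jsonwebtoken package; the payload, secret fallback, and expiry are illustrative:

```js
const jwt = require('jsonwebtoken');
const secret = process.env.JWT_SECRET || 'dev-only-secret';

// Issued at login: short-lived token, non-sensitive payload
const token = jwt.sign({ sub: 'user-123', role: 'user' }, secret, { expiresIn: '15m' });

// Verified on every request (typically in an auth middleware)
try {
  const payload = jwt.verify(token, secret); // throws if expired or tampered with
  console.log(payload.sub);
} catch (err) {
  // respond with 401 Unauthorized
}
```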
104. What is Semantic Versioning (SemVer) and what do the caret (^) and tilde (~) symbols mean?
SemVer is a versioning standard: Major.Minor.Patch (e.g., 1.2.3). Patch fixes bugs (backward compatible); Minor adds features (backward compatible); Major introduces breaking changes. In package.json: Tilde (~1.2.3) allows patch updates (1.2.x). Caret (^1.2.3) allows minor updates (1.x.x) but freezes the major version. The caret is the default NPM behavior, as it assumes minor updates are safe, but a cautious senior developer might pin exact versions to avoid unexpected breakage.
105. Explain the role of inspect and the Node.js Debugger.
Node.js has a built-in out-of-process debugging utility accessible via node inspect script.js. However, modern debugging typically uses the V8 Inspector Protocol via the --inspect flag. This opens a WebSocket connection that tools like Chrome DevTools or VS Code can connect to. It allows you to set breakpoints, step through code, inspect memory heaps, and profile CPU usage in real-time, which is far superior to debugging via console.log.
106. How do you implement a 'Stub' in Node.js testing?
A Stub is a test double that replaces a real component (like a database function or API call) with a controllable replacement. Unlike a Mock (which verifies behavior), a Stub is used to force a specific state or return value to test how your code handles it. For example, you might stub fs.readFile to simulate a 'File Not Found' error to ensure your error handling logic works, or stub an API call to return a specific JSON object immediately without hitting the network. Libraries like Sinon.js are standard for this.
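A small sketch assuming Sinon.js; the payment gateway object is a hypothetical dependency:

```js
const sinon = require('sinon');

const paymentGateway = {
  async charge(amount) { /* real HTTP call to the provider in production */ },
};

// Force a canned response so the order logic can be tested offline and deterministically
const stub = sinon.stub(paymentGateway, 'charge').resolves({ status: 'success' });

// ... exercise the code under test, which calls paymentGateway.charge(...) ...

sinon.assert.calledOnce(stub); // optional behaviour check
stub.restore();                // put the real method back
```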
107. How does zlib handle compression in Node.js?
The zlib module provides compression functionality (Gzip, Deflate, Brotli) implemented using C++ bindings for speed. It is essential for optimizing HTTP responses. In Express, you might use the compression middleware (which uses zlib under the hood) to compress JSON or HTML responses before sending them to the client. This significantly reduces payload size (often by 70%+) and improves load times. It can process data using Streams, allowing you to compress large files on the fly without buffering them.
108. What is the difference between ESLint and Prettier? Why use both?
ESLint is a Linter: it analyzes code for logical errors, bugs, and bad practices (e.g., using variables before declaration, unused imports). Prettier is a Formatter: it strictly enforces stylistic choices (e.g., indentation, semicolons, quotes) to ensure consistency. While they overlap slightly, best practice is to use both: let Prettier handle the formatting (how code looks) and ESLint handle the code quality (how code works), often integrating them via eslint-config-prettier to prevent conflicts.
109. Explain the standard File System flags r, w, a, and w+. When would you use them?
These flags determine how a file is opened.
- r: Open for reading (fails if the file doesn't exist).
- w: Open for writing (creates the file if missing, or truncates/overwrites existing content).
- a: Open for appending (creates the file if missing, writes to the end).
- w+: Open for reading and writing (truncates the file immediately).
A senior developer uses a for logs (to keep history) and w for saving new configuration files where total replacement is desired.
110. What is Passport.js, and what is a 'Strategy' in this context?
Passport.js is the standard authentication middleware for Node.js. It is modular and unopinionated. A 'Strategy' is a pluggable module that implements a specific authentication logic. Instead of writing auth code from scratch, you install strategies like passport-local (username/password), passport-jwt (token-based), or passport-google-oauth20 (Social Login). This allows developers to switch or combine auth methods (e.g., allowing both Google Login and Password Login) with minimal code changes.
111. How do you use util.promisify() and why is it useful for legacy Node.js APIs?
Many core Node.js modules (like fs, dns, zlib) originally used the 'Error-First Callback' pattern (e.g., fs.readFile(path, (err, data) => ...)). util.promisify() takes a function following this callback style and converts it into a function that returns a Promise. This allows you to use modern async/await syntax with older native libraries, keeping your codebase consistent and avoiding mixed callback/promise logic.
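A minimal sketch converting a callback-style core API to a Promise:

```js
const util = require('util');
const fs = require('fs');

const readFileAsync = util.promisify(fs.readFile); // callback API -> Promise-returning API

async function main() {
  const data = await readFileAsync('package.json', 'utf8');
  console.log(JSON.parse(data).name);
}
main().catch(console.error);
```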
112. Compare npm, yarn, and pnpm. Why might you choose pnpm?
npm is the default package manager. yarn was created to address npm's speed and security issues (though npm has since caught up). pnpm (Performant NPM) is a modern alternative that uses a unique strategy: it uses hard links and symlinks to store all packages in a single global content-addressable store. This means if you have 10 projects using React, pnpm stores React only once on the disk, whereas npm and yarn would duplicate it 10 times. pnpm is significantly faster and saves massive disk space.
113. What is the role of npm audit and how do you resolve the vulnerabilities it finds?
npm audit scans your package-lock.json against a public database of known security vulnerabilities in your dependencies. It reports severity levels (Low to Critical). To resolve, you can run npm audit fix (which attempts non-breaking upgrades). For breaking changes or deep dependency issues, a senior developer must manually override versions using the overrides field (in package.json) or upgrade the parent package that relies on the vulnerable dependency.
114. Explain the difference between Soft Delete and Hard Delete in database design.
Hard Delete (SQL DELETE) permanently removes the record from the database; it is unrecoverable. Soft Delete keeps the record but marks it as deleted via a flag (e.g., isDeleted: true or deletedAt: timestamp). In enterprise Node.js apps, Soft Delete is preferred to maintain data integrity, allow for audit trails, and enable 'Undo' functionality. However, it requires all queries to include a where deletedAt is null filter to exclude deleted items.
115. How do you implement API Rate Limiting in Node.js and why?
Rate limiting restricts the number of requests a user/IP can make in a specific time window (e.g., 100 requests per 15 minutes). It protects against Brute Force attacks (on login) and Denial of Service (DoS) attacks. In Node.js, we typically use middleware like express-rate-limit. For distributed systems (multiple server instances), the rate limit state must be stored in a shared store like Redis, otherwise, the limit would apply only per server, not globally.
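A minimal sketch assuming the express-rate-limit package with its commonly documented options; the window and limit values are illustrative:

```js
const express = require('express');
const rateLimit = require('express-rate-limit');

const app = express();

// Roughly: 100 requests per IP per 15-minute window
const limiter = rateLimit({ windowMs: 15 * 60 * 1000, max: 100 });
app.use('/api/', limiter);

app.get('/api/ping', (req, res) => res.send('pong'));
app.listen(3000);
// Note: behind multiple instances, back the limiter with a shared store (e.g. Redis)
```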
116. What is 'Control Flow' in Node.js and how has it evolved?
Control Flow refers to the order in which functions execute. Because Node is async, code doesn't necessarily run top-to-bottom. Evolution:
- Callbacks (Error-prone, Hell).
- Libraries like Async.js (waterfall, series, parallel functions).
- Promises (Chainable, cleaner).
- Async/Await (Standard, synchronous-style syntax).
A senior developer understands all four to maintain legacy code but writes new code using Async/Await.
117. Why is console.log discouraged in high-performance production applications?
console.log is synchronous when writing to terminals (TTY) and in some piping scenarios. Using it heavily in a high-traffic loop can block the Event Loop. Furthermore, it lacks structured data (timestamps, log levels, JSON format). Production apps should use asynchronous, structured logging libraries like Pino or Winston. These libraries buffer logs and write them asynchronously, often formatting them as JSON for ingestion by log management systems (like ELK Stack or Datadog).
118. Explain Idempotency in the context of REST APIs.
An operation is idempotent if making the same request multiple times produces the same result as making it once. In HTTP: GET, PUT, and DELETE should be idempotent. (Deleting a record twice results in the same state: the record is gone). POST is generally not idempotent (sending the same POST twice creates two resources). Understanding this is critical for handling network retries—if a client retries a POST, they might duplicate data, whereas retrying a PUT is safe.
119. Explain the concept of 'Sticky Sessions' when using Node.js Clustering or Load Balancers.
When scaling Node.js across multiple processes or servers, a user's requests might be handled by different instances. If you store session data in local memory (RAM), a request hitting Server B won't find the session created on Server A. 'Sticky Sessions' (Session Affinity) configures the Load Balancer to use the client's IP or a cookie to ensure all requests from a specific user are always routed to the same server instance. However, a better senior-level solution is to use a distributed session store (Redis) to make the backend stateless, eliminating the need for sticky sessions entirely.
120. What is the difference between call, apply, and bind in JavaScript?
These methods control the this context of a function. call() invokes the function immediately, accepting arguments one by one (comma-separated). apply() invokes the function immediately but accepts arguments as an array. bind() does not invoke the function; instead, it returns a new function with this permanently bound to the specified context, which can be executed later. bind is frequently used in React class components or partial application patterns.
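A short illustration of the three:

```js
function greet(greeting, punctuation) {
  return `${greeting}, ${this.name}${punctuation}`;
}
const user = { name: 'Ada' };

greet.call(user, 'Hello', '!');            // invokes now, arguments listed one by one
greet.apply(user, ['Hello', '!']);         // invokes now, arguments as an array
const greetAda = greet.bind(user, 'Hello');// returns a new function, not invoked yet
greetAda('?');                             // "Hello, Ada?" — executed later
```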
121. Describe the Multi-Stage Build pattern for Dockerizing a Node.js application.
Multi-stage builds optimize Docker image size and security. Stage 1 (Build) uses a full Node image containing build tools (compilers, python) to install dependencies and build artifacts (e.g., TypeScript to JS). Stage 2 (Production) starts fresh with a minimal image (like node:alpine), copies only the necessary artifacts (dist folder, package.json, and production node_modules) from Stage 1. This discards the heavy build tools and source code, resulting in a tiny, secure image.
122. What is Cross-Site Request Forgery (CSRF), and how do you prevent it in Node.js?
CSRF is an attack where a malicious site tricks a user's browser into sending a request to a trusted site where the user is authenticated (using cookies). Because cookies are sent automatically, the server accepts the request. Prevention involves using CSRF Tokens: the server generates a unique token that must be included in the request body or headers (which the malicious site cannot access). Libraries like csurf (legacy) or csrf-csrf are used to implement this validation middleware.
123. What is the role of util.debuglog?
util.debuglog(section) creates a function that conditionally writes to stderr based on the NODE_DEBUG environment variable. For example, if you create const debug = util.debuglog('foo'), the logs only appear if you run the app with NODE_DEBUG=foo node app.js. This is the native Node.js equivalent of the popular debug npm package, allowing you to leave instrumentation code in production with zero performance overhead until it's explicitly enabled.
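A tiny sketch; the section name is arbitrary:

```js
const util = require('util');
const debug = util.debuglog('payments');

// Prints to stderr only when run as: NODE_DEBUG=payments node app.js
debug('charging card for order %d', 42);
```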
124. How do you handle file uploads in Node.js? Why is multer commonly used?
Node.js (and Express) does not parse multipart/form-data (file uploads) by default; it only handles JSON/URL-encoded bodies. To handle uploads, we use middleware like multer or formidable. multer is preferred because it processes the input as a Stream, allowing you to stream the uploaded file directly to disk or cloud storage (like S3) without loading the entire file into RAM, which is critical for preventing memory exhaustion attacks.
125. What are the security risks of using npm install without checking scripts?
NPM packages can contain 'Lifecycle Scripts' (like preinstall or postinstall) that execute automatically upon installation. Malicious packages use these to steal environment variables (AWS keys) or install malware. To mitigate this in CI/CD or when auditing, you can run npm install --ignore-scripts. Senior developers also use tools like Socket.dev or Snyk to analyze package behavior before introducing them to the codebase.
126. How do you ensure 'Graceful Shutdown' in Node.js?
Graceful shutdown ensures that when a server stops (deployment or crash), it finishes processing current requests before exiting. You listen for termination signals (SIGTERM, SIGINT). When received:
- Stop the server from accepting new connections (server.close()).
- Wait for existing connections to finish (or timeout).
- Close database connections/pools.
- Exit the process (process.exit(0)).
This prevents users from getting 'Connection Reset' errors during deployments.
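A minimal sketch of the shutdown sequence using the core http module; the 10-second timeout and the placeholder for closing pools are illustrative:

```js
const http = require('http');
const server = http.createServer((req, res) => res.end('ok')).listen(3000);

function shutdown() {
  console.log('Termination signal received, shutting down gracefully');
  server.close(() => {
    // close DB pools / message-queue connections here, then exit cleanly
    process.exit(0);
  });
  // Safety net: force-exit if open connections refuse to drain in time
  setTimeout(() => process.exit(1), 10_000).unref();
}

process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);
```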
127. What is Memoization and how would you implement it?
Memoization is an optimization technique where you cache the results of expensive function calls and return the cached result when the same inputs occur again. In Node.js, you can implement this using a Map or WeakMap where keys are the arguments and values are the results. Libraries like lodash.memoize provide this. It is vital for optimizing recursive functions or heavy computations, but care must be taken to avoid memory leaks (cache invalidation is hard).
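A minimal hand-rolled memoizer for single-argument functions; the slow function is a stand-in for any expensive computation:

```js
function memoize(fn) {
  const cache = new Map();
  return (arg) => {
    if (!cache.has(arg)) cache.set(arg, fn(arg)); // compute once per distinct input
    return cache.get(arg);
  };
}

const slowSquare = (n) => { for (let i = 0; i < 1e7; i++); return n * n; };
const fastSquare = memoize(slowSquare);

fastSquare(9); // computed
fastSquare(9); // served from the cache
```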
128. Why should you separate the Express 'App' and 'Server' files (e.g., app.js and server.js)?
Separating the App definition from the Server listening logic is a testing best practice. app.js should export the Express application (middleware, routes) without calling .listen(). server.js imports the app and starts the server. This allows you to import app into integration testing tools (like Supertest) to simulate requests without actually binding to a network port, preventing 'EADDRINUSE' errors during parallel test execution.
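A minimal sketch of the split, assuming Jest and Supertest for the test file:

```js
// app.js — exports the Express app without listening
const express = require('express');
const app = express();
app.get('/health', (req, res) => res.json({ ok: true }));
module.exports = app;

// server.js — the only file that binds to a port
const app = require('./app');
app.listen(process.env.PORT || 3000);

// app.test.js — Supertest exercises the app in-memory, no port needed
const request = require('supertest');
const app = require('./app');

test('health check', async () => {
  await request(app).get('/health').expect(200);
});
```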
129. What is the difference between setImmediate() and process.nextTick() regarding recursion?
Recursive calls to process.nextTick() can cause I/O starvation because nextTick fires before the Event Loop moves to the next phase; the loop essentially gets stuck processing ticks forever. Recursive calls to setImmediate() are safe because they are queued for the next iteration of the Event Loop (Check phase), allowing I/O and timers to run in between. Use setImmediate for splitting long-running synchronous tasks.
130. What is a 'Race Condition' in Node.js, given that it is single-threaded?
While Node.js executes JS on a single thread, race conditions still occur in asynchronous operations involving shared external resources (Database, Files). Example: Two requests read a DB value (e.g., 'Stock: 1') simultaneously. Both see '1', both decrement it, and both write '0'. The stock should be '-1'. A senior developer solves this using Database transactions (row locking) or Atomic operators (like MongoDB's $inc) rather than relying on JS logic.
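A brief sketch of the atomic approach, assuming the official mongodb driver; `products` is a collection handle passed in by the caller:

```js
async function reserveItem(products, productId) {
  const result = await products.updateOne(
    { _id: productId, stock: { $gt: 0 } }, // match only while stock remains
    { $inc: { stock: -1 } }                // atomic, server-side decrement
  );
  if (result.modifiedCount === 0) {
    throw new Error('Out of stock');       // another request won the race
  }
}
```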
131. What are the pros and cons of using ORMs (like Sequelize/TypeORM) vs Native Drivers (pg/mongodb)?
ORMs offer portability (switch DBs easily), type safety, and auto-migrations, speeding up development. Cons: They add overhead (slower performance), generate inefficient queries for complex joins, and hide the underlying mechanics. Native Drivers offer maximum performance and full access to database-specific features but require writing raw queries and manual boilerplate. Senior devs often choose Native Drivers (or query builders like Knex) for high-performance microservices.
132. What is the difference between dependencies, devDependencies, and peerDependencies?
dependencies: Required to run. devDependencies: Required to build/test. peerDependencies: Used in library development; it tells the consumer 'I need this package (e.g., React), but I expect you to install it'. This prevents version conflicts (avoiding multiple copies of React in the same project). Failing to understand peerDependencies is a common source of plugin errors.
133. How do you handle floating point math in Node.js (e.g., 0.1 + 0.2)?
JavaScript uses IEEE 754 floating point numbers, so 0.1 + 0.2 !== 0.3 (it is 0.30000000000000004). In financial Node.js applications, never use standard numbers for money. Use libraries like Decimal.js, Big.js, or handle values as integers (cents) to ensure precision. Native BigInt can also be used for large integers, but not for decimals.
134. What is the 'Revealing Module Pattern'?
This is a design pattern (pre-ES6 modules) where you define variables and functions in a private scope (closure) and return an object exposing only the public methods. This mimics public/private access modifiers found in other languages. While less common now due to ES Modules and TypeScript, understanding it is useful for reading legacy libraries.
135. Why is Node.js often preferred over traditional multithreaded backends like Java or PHP for I/O-heavy applications?
Traditional models (like Java Spring Boot or PHP Apache) typically create a new thread for every incoming request. If a request blocks (e.g., waiting for a DB query), that thread consumes RAM and CPU while doing nothing. Node.js uses a single-threaded, non-blocking Event Loop. It can handle tens of thousands of concurrent connections with minimal overhead because it doesn't wait for I/O; it delegates it. This makes Node.js superior for real-time apps (chat, sockets) and I/O-heavy microservices, though Java/Go may still excel at raw CPU-bound processing.
136. What is the connect module, and how does it relate to Express.js?
connect is a legacy HTTP server framework that introduced the middleware plugin architecture (using .use()) to Node.js. It provided the foundation upon which Express.js was built. While Express extended Connect with routing, views, and other features, modern Express (v4+) has removed the direct dependency on Connect. However, understanding Connect is useful for maintaining very old Node.js applications or understanding the origin of the (req, res, next) middleware signature.
137. How do you securely update dependencies in a large Node.js project?
Simply running npm update can be risky. A senior developer follows a process:
- Run npm outdated to see available versions.
- Consult the changelogs for 'Breaking Changes' (Major version bumps).
- Update one package at a time (or small groups).
- Run the full test suite (npm test) after each update.
- Use tools like Renovate or Dependabot to automate pull requests for updates, ensuring that CI checks pass before merging.
138. How do you define consistent code style across a team of developers?
Relying on discipline is insufficient. We enforce style via tooling: 1) EditorConfig: Ensures consistent indentation across different IDEs. 2) Prettier: Auto-formats code on save (handling wrapping, quotes, semicolons). 3) ESLint: Enforces code quality rules (e.g., 'no-unused-vars'). 4) Husky: A git hook that runs these tools before a commit (pre-commit), preventing 'messy' code from ever entering the repository.
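A hedged sketch of how these tools are commonly wired together through package.json and lint-staged (the exact Husky setup varies by version; 'husky install' applies to older releases):

```json
{
  "scripts": {
    "lint": "eslint .",
    "format": "prettier --write .",
    "prepare": "husky install"
  },
  "lint-staged": {
    "*.{js,ts}": ["prettier --write", "eslint --fix"]
  }
}
```

The Husky pre-commit hook then typically just runs lint-staged, so only formatted, lint-clean files reach the repository.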
139. What is the difference between setImmediate() and setTimeout()?
setTimeout() schedules a callback to run after a minimum delay (in milliseconds). setImmediate() schedules a callback to run specifically in the 'Check' phase of the Event Loop, which occurs immediately after the 'Poll' (I/O) phase. If you run both in a main module (not inside an I/O cycle), the order is non-deterministic (depends on process performance). However, inside an I/O callback (like a file read), setImmediate() is guaranteed to run before setTimeout(), regardless of the timer delay.
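A short demonstration of both cases, reading the current file to create an I/O cycle:

```js
const fs = require('fs');

// In the main module the order of these two is non-deterministic:
setTimeout(() => console.log('timeout (main)'), 0);
setImmediate(() => console.log('immediate (main)'));

// Inside an I/O callback, the Check phase comes right after Poll, so
// setImmediate always fires before the 0ms timer.
fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout (I/O)'), 0);
  setImmediate(() => console.log('immediate (I/O)')); // logged first
});
```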
140. What are the limitations or 'Cons' of Node.js?
- CPU-Intensive Tasks: Because it is single-threaded, heavy computations (video encoding, complex math) block the Event Loop, halting the entire server.
- Callback Hell: Without discipline or modern patterns (Async/Await), code can become unreadable.
- Immaturity of some tooling: While core tools are stable, the NPM ecosystem can be volatile with abandoned or low-quality packages.
141. Differentiate between process.nextTick() and setImmediate() regarding their execution order in the event loop.
The naming can be counter-intuitive. process.nextTick() fires immediately after the current operation completes, but before the event loop continues to the next phase. It places callbacks in the nextTickQueue, which is processed before promises and before the event loop moves on. This can technically starve the I/O event loop if called recursively. setImmediate(), on the other hand, is designed to execute scripts once the current poll phase completes, specifically in the 'Check' phase of the event loop. In summary: nextTick is 'right now, before anything else', and setImmediate is 'on the next tick of the event loop'.
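A quick ordering illustration (the relative order of the last two lines can vary when run from the main module):

```js
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
Promise.resolve().then(() => console.log('promise'));
process.nextTick(() => console.log('nextTick'));
console.log('sync');

// Typical output:
// sync
// nextTick   (the nextTickQueue drains before the promise microtask queue)
// promise
// timeout / immediate (order of these two is non-deterministic here)
```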
142. What are Streams in Node.js, and how do they improve performance when handling large data sets?
Streams are collections of data meant to be handled sequentially, chunk by chunk, rather than reading the entire dataset into memory at once. There are four types: Readable, Writable, Duplex, and Transform. Streams drastically improve performance and memory efficiency. For example, when reading a 2GB file to send to a client, standard buffering would crash the heap. Streams allow you to pipe data chunks to the response object as they are read (fs.createReadStream().pipe(res)), keeping memory usage low and constant regardless of file size. They also handle 'backpressure,' ensuring a fast source doesn't overwhelm a slow destination.
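A minimal sketch, assuming a hypothetical large file on disk:

```js
const fs = require('fs');
const http = require('http');

// Streaming keeps memory usage flat regardless of file size; pipe() also
// propagates backpressure from a slow client back to the file read.
http.createServer((req, res) => {
  const stream = fs.createReadStream('./big-export.csv'); // hypothetical file
  stream.on('error', () => {
    res.statusCode = 500;
    res.end('Unable to read file');
  });
  stream.pipe(res);
}).listen(3000);
```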
143. How can Node.js be scaled to utilize multi-core systems, and what are the trade-offs of using the Cluster module?
Since Node.js is single-threaded, a single instance runs on one CPU core. To scale, we use the cluster module (or process managers like PM2) to fork the main process into multiple worker processes, usually one per CPU core. The master process distributes incoming connections to workers. Trade-offs include increased memory usage (each worker is a distinct V8 instance) and the complexity of managing state; since workers do not share memory, state must be stored in an external store like Redis (for sessions/caching) rather than in-memory variables.
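A minimal cluster sketch (the port and worker logic are illustrative; cluster.isPrimary requires Node 16+, older versions use isMaster):

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) {
  // One worker per CPU core; replace any worker that crashes.
  for (let i = 0; i < os.cpus().length; i++) cluster.fork();
  cluster.on('exit', () => cluster.fork());
} else {
  http.createServer((req, res) => {
    res.end(`Handled by worker ${process.pid}`);
  }).listen(3000); // the primary distributes connections among workers
}
```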
144. What are the essential security practices for hardening a Node.js web API?
Securing Node.js involves multiple layers:
- Headers: Use helmet to set secure HTTP headers (HSTS, X-Frame-Options, etc.).
- Input Validation: Validate all incoming data using libraries like Joi or Zod to prevent injection attacks (SQLi, NoSQLi).
- Rate Limiting: Implement express-rate-limit to prevent DDoS and brute-force attacks.
- Dependencies: Run npm audit to check for known vulnerabilities in the supply chain.
- Error Handling: Never expose stack traces or internal implementation details to the client in production error responses.
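A minimal Express hardening sketch, assuming helmet and express-rate-limit are installed (the limits and port are illustrative):

```js
const express = require('express');
const helmet = require('helmet');
const rateLimit = require('express-rate-limit');

const app = express();

app.use(helmet());                                           // secure HTTP headers
app.use(rateLimit({ windowMs: 15 * 60 * 1000, max: 100 }));  // per-IP request cap
app.use(express.json({ limit: '10kb' }));                    // reject oversized JSON bodies

// ...routes go here...

app.listen(3000);
```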
145. Explain the mechanics of a Promise in JavaScript and how Promise.all differs from Promise.allSettled.
A Promise is an object representing the eventual completion (or failure) of an asynchronous operation and its resulting value. It has three states: Pending, Fulfilled, and Rejected. Promise.all([p1, p2]) runs promises concurrently and rejects immediately if any of the input promises reject (fail-fast behavior). Promise.allSettled([p1, p2]) waits for all promises to finish, regardless of whether they resolved or rejected, returning an array of objects describing the outcome of each. allSettled is preferred when you need the results of independent async tasks where one failure shouldn't stop the others.
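A short illustration (the promises are arbitrary placeholders):

```js
const fast = Promise.resolve('ok');
const slow = new Promise((resolve) => setTimeout(() => resolve('done'), 100));
const broken = Promise.reject(new Error('boom'));

// Fail-fast: rejects as soon as 'broken' rejects.
Promise.all([fast, slow, broken])
  .then(console.log)
  .catch((err) => console.error('all rejected:', err.message));

// Waits for every promise and reports each outcome individually.
Promise.allSettled([fast, slow, broken]).then((results) => {
  console.log(results.map((r) => r.status)); // ['fulfilled', 'fulfilled', 'rejected']
});
```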
146. What is the proper strategy for handling errors in asynchronous Node.js code to prevent application crashes?
Uncaught exceptions in Node.js can crash the entire process.
- Async/Await: Wrap code in try/catch blocks.
- Promises: Always attach a .catch() handler.
- Global Handlers: Listen for process.on('unhandledRejection') and process.on('uncaughtException') for logging and graceful shutdown, but rely on them only as a last resort.
- Express: Pass errors to the next(err) function so centralized error-handling middleware can standardize the API response and log the error.
- Operational vs Programmer Errors: Distinguish between expected errors (user input) and bugs; restart the process for programmer errors to ensure a clean state.
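A sketch tying these points together in Express (the route and findUserById helper are hypothetical):

```js
const express = require('express');
const app = express();

// Hypothetical lookup, included only so the example runs.
async function findUserById(id) { /* DB query would go here */ return null; }

// Wrapper so rejected promises from async routes reach next(err)
// instead of becoming unhandled rejections.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

app.get('/users/:id', asyncHandler(async (req, res) => {
  const user = await findUserById(req.params.id);
  if (!user) {
    const err = new Error('User not found');
    err.status = 404;
    throw err; // forwarded to the error middleware via .catch(next)
  }
  res.json(user);
}));

// Centralized error middleware: one place to log and shape responses.
app.use((err, req, res, next) => {
  console.error(err);
  res.status(err.status || 500).json({ error: err.message });
});

// Last-resort handlers: log, then exit so a process manager can restart cleanly.
process.on('unhandledRejection', (reason) => { throw reason; });
process.on('uncaughtException', (err) => {
  console.error('Fatal:', err);
  process.exit(1);
});

app.listen(3000);
```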
147. Compare connecting Node.js to SQL versus NoSQL databases, specifically focusing on ORMs versus native drivers.
Node.js connects to databases via TCP/IP using drivers.
SQL (PostgreSQL, MySQL): Structured data. We typically use an ORM (Object-Relational Mapper) like Sequelize, TypeORM, or Prisma to manage schemas and migrations, or a query builder like Knex.js. Connection pooling is critical here to prevent exhausting database connections.
NoSQL (MongoDB): Unstructured/Document data. Often uses Mongoose (ODM) to enforce schemas at the application level, or the native MongoDB driver for raw speed.
Node's non-blocking nature works well with both, but developers must ensure heavy database aggregation queries don't time out the request.
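As a sketch of the connection-pooling point on the SQL side, assuming the pg driver and a hypothetical users table (connection settings are placeholders):

```js
const { Pool } = require('pg');

// A shared pool reuses TCP connections instead of opening one per request.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 10,                  // cap on concurrent connections
  idleTimeoutMillis: 30_000,
});

async function getUser(id) {
  // Parameterized query: the driver handles escaping for $1, preventing SQL injection.
  const { rows } = await pool.query('SELECT id, name FROM users WHERE id = $1', [id]);
  return rows[0];
}
```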
148. When and how should you use Child Processes (spawn, exec, fork) in Node.js?
Child processes are used to execute CPU-intensive tasks or external scripts without blocking the main Event Loop.
- exec: Runs a command in a shell and buffers the output (good for small output).
- spawn: Streams data returned by the process (better for large data or long-running processes).
- fork: Specifically for spawning new Node.js processes; it establishes an IPC (Inter-Process Communication) channel allowing message passing between parent and child.
Use these when you need to perform heavy computation (like video encoding) or run system commands.
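A brief sketch of fork and spawn (worker.js is a hypothetical script that listens for 'message' events and replies via process.send()):

```js
const { fork, spawn } = require('child_process');

// fork: a Node.js child with a built-in IPC channel for message passing.
const worker = fork('./worker.js');
worker.send({ task: 'resize', file: 'photo.png' });
worker.on('message', (result) => console.log('worker replied:', result));

// spawn: stream output from an external command (here, a directory listing on Unix).
const ls = spawn('ls', ['-lh']);
ls.stdout.on('data', (chunk) => process.stdout.write(chunk));
ls.on('close', (code) => console.log(`ls exited with code ${code}`));
```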