Building a Browser MMO: Our Tech Stack and Architecture Decisions
A deep dive into the technical decisions behind building Outer Directive, a persistent 4X space MMO that runs entirely in the browser. Three.js, Node.js, PostgreSQL, Redis, WebSocket, and a 1-second tick engine.
Why We Chose the Browser
When we started building Outer Directive, the first question everyone asked was: why the browser? The honest answer is philosophical as much as it is technical. We believe that downloading a 60GB client, maintaining a gaming PC, and waiting through patch days are all forms of gatekeeping. They filter out players who might love your game but cannot commit the hardware, the bandwidth, or the storage space.
We wanted a deep 4X space MMO where the only requirement is a browser tab. Open a URL, log in, and you are in a universe of 1,900+ star systems with persistent economies, territorial warfare, and alliance politics. No download. No install. No "minimum requirements" page that reads like a hardware shopping list.
That decision shaped every technical choice that followed.
The Rendering Layer: Three.js
We evaluated several options for 3D rendering in the browser. Raw WebGL gives you maximum control but demands an enormous amount of boilerplate for even basic scenes. Babylon.js is powerful but heavy. We landed on Three.js for a few key reasons.
First, the ecosystem. Three.js has the largest community of any WebGL framework, which means better documentation, more examples, and faster answers when we hit edge cases. Second, it gives us fine-grained control over the render pipeline without forcing us to write raw shader code for everything. Third, the library footprint is manageable. We can tree-shake what we do not use and keep the initial load lean.
Our rendering approach is layered. The star map view uses instanced rendering to display thousands of star systems without creating individual draw calls for each one. When you zoom into a system, we load the detailed system view with planetary bodies, stations, and ship positions. At the tactical level during combat, we render ship models, weapon effects, and movement vectors.
The key optimization was accepting that we do not need to compete with Unreal Engine. Our art direction leans into clean, readable visuals rather than photorealism. This keeps frame rates stable across a wide range of hardware, from gaming rigs down to Chromebooks and tablets.
The Server: Node.js and the 1-Second Tick Engine
The server is built on Node.js. We chose it for two pragmatic reasons: JavaScript on both client and server means shared type definitions and validation logic, and the event-driven architecture maps well to our game loop pattern.
At the heart of the server is a 1-second tick engine. Every second, the server processes a game tick that updates the state of the entire universe. Ship movements advance. Production cycles complete. Combat rounds resolve. Resource flows update. Sovereignty timers count down.
This tick-based architecture was a deliberate choice over continuous real-time simulation. A persistent 4X game does not need 60fps server updates. Strategy happens on the scale of seconds, minutes, and hours. The 1-second tick gives us predictable server load, deterministic game state, and the ability to process complex calculations (like physics-based combat formulas across hundreds of simultaneous engagements) without frame-time pressure.
Each tick follows a strict execution order:
- Process player commands queued since the last tick
- Update ship positions and movement vectors
- Resolve combat calculations
- Process production and resource generation
- Update market state and pending transactions
- Check sovereignty and structure timers
- Broadcast state deltas to connected clients
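The tick loop above can be sketched as an ordered list of phase functions with per-phase timing, so a budget overrun points directly at the guilty phase. This is an illustrative sketch, not the engine's actual code: the phase names, state shape, and 80% warning threshold are all assumptions.

```javascript
// Illustrative tick engine sketch. Phase bodies are stubs except for
// command draining and movement, which show the general shape.
const TICK_MS = 1000;

const phases = [
  ['commands',   (state) => { state.commandQueue.length = 0; }],
  ['movement',   (state) => { for (const s of state.ships) { s.x += s.vx; s.y += s.vy; } }],
  ['combat',     (state) => {}],
  ['production', (state) => {}],
  ['market',     (state) => {}],
  ['timers',     (state) => {}],
  ['broadcast',  (state) => {}],
];

function runTick(state) {
  const timings = {};
  for (const [name, phase] of phases) {
    const start = process.hrtime.bigint();
    phase(state);
    timings[name] = Number(process.hrtime.bigint() - start) / 1e6; // ms
  }
  const total = Object.values(timings).reduce((a, b) => a + b, 0);
  if (total > TICK_MS * 0.8) {
    console.warn('tick approaching budget', timings); // per-phase breakdown
  }
  return timings;
}
```

Because the phases run in a fixed order every tick, the simulation stays deterministic: the same queued commands against the same state always produce the same result.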
The entire tick must complete within its 1-second window. We monitor tick execution time closely, and if any phase starts consuming an outsized share of that budget, we know exactly where to optimize.
Real-Time Communication: WebSocket
Every connected client maintains a persistent WebSocket connection to the server. We evaluated several options here. HTTP polling added too much latency and too much overhead for a game that updates every second. Server-Sent Events are one-directional, and we need bidirectional communication for player commands. WebSocket was the natural fit.
We do not send full game state on every tick. Instead, we send state deltas. If a ship moved, the client receives the new position and velocity vector. If a market price changed, the client receives the updated price. This keeps bandwidth manageable even for players monitoring multiple systems simultaneously.
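The delta idea can be shown with a minimal sketch: compare the previous and current tick snapshots and emit only entities that changed. The field names and state shape here are assumptions, not the actual wire protocol.

```javascript
// Minimal per-tick delta extraction sketch: only changed ships cross the wire.
function computeDelta(prev, next) {
  const delta = {};
  for (const [id, ship] of Object.entries(next.ships)) {
    const old = prev.ships[id];
    if (!old || old.x !== ship.x || old.y !== ship.y) {
      // Send the new position plus velocity so the client can interpolate.
      delta[id] = { x: ship.x, y: ship.y, vx: ship.vx, vy: ship.vy };
    }
  }
  return delta;
}
```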
On the client side, we interpolate between ticks for visual smoothness. When the client receives a ship's new position at tick N, we animate the transition over the next second so movement appears fluid rather than jerky. The game state is authoritative on the server; the client is purely a rendering and input layer.
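The interpolation itself is a simple linear blend over the 1-second tick window. A minimal sketch (function and field names are illustrative):

```javascript
// Lerp between the last two authoritative positions over the tick window.
function interpolate(prevPos, nextPos, msSinceTick) {
  const t = Math.min(msSinceTick / 1000, 1); // clamp at the tick boundary
  return {
    x: prevPos.x + (nextPos.x - prevPos.x) * t,
    y: prevPos.y + (nextPos.y - prevPos.y) * t,
  };
}
```

Halfway through the window the rendered ship sits exactly halfway between the two server positions, so motion reads as continuous even though the server only speaks once per second.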
The Database Layer: PostgreSQL and Redis
Our persistence layer is split across two systems, each handling what it does best.
PostgreSQL is our source of truth. Player accounts, ship configurations, structure placements, alliance relationships, sovereignty maps, market history, and economic data all live in PostgreSQL. We chose it over alternatives because of its reliability, its support for complex queries (we run monthly economic reports that aggregate millions of transactions), and its JSONB support for flexible schema elements like ship fitting configurations.
Redis handles in-memory state that needs sub-millisecond access. The current positions of all ships in space, active combat engagements, queued player commands, session data, and real-time market order books all live in Redis. When the server processes a tick, it reads the current universe state from Redis, applies all updates, writes the new state back to Redis, and asynchronously persists relevant changes to PostgreSQL.
This split means that a server restart does not lose the game state (Redis persistence plus PostgreSQL backups), but the hot path of the game loop never waits on disk I/O. The tick engine reads and writes entirely against in-memory data.
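The read, apply, write-back, persist-async pattern can be sketched in a few lines. Here a plain `Map` stands in for Redis and a caller-supplied function stands in for the PostgreSQL write; both stand-ins are assumptions for illustration, not the actual clients.

```javascript
const hotStore = new Map(); // stand-in for Redis

// One system's tick: read hot state, apply the update, write back,
// and kick off the durable write without blocking the tick loop.
function tickPersist(systemId, applyTick, persistFn) {
  const state = hotStore.get(systemId);   // in-memory read, no disk I/O
  const next = applyTick(state);          // pure tick update
  hotStore.set(systemId, next);           // write hot state back
  Promise.resolve(persistFn(systemId, next)).catch(console.error); // fire-and-forget
  return next;
}
```

The important property is that `applyTick` never awaits anything: the durable write happens off the hot path, so a slow disk cannot stall the tick.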
Handling 1,900 Star Systems
Our universe contains over 1,900 star systems, each with its own planets, stations, asteroid belts, and jump gates. Managing this at scale required some architectural decisions.
We use spatial partitioning to ensure the server only processes active systems in detail. A system with no players present and no active production or combat enters a dormant state where it still advances on a slower schedule (resource regeneration, NPC patrol routes) but does not consume tick budget on detailed simulation.
When a player jumps into a system, it spins up to full simulation fidelity. Production chains process every tick. Market orders match in real time. Combat resolves at full detail. This on-demand activation means we can have a massive universe without paying the computational cost of simulating every corner of it every second.
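A hypothetical activity check makes the dormant/active split concrete. The field names and the "every 60 ticks" dormant schedule are illustrative guesses, not the game's actual tuning.

```javascript
// Decide how much simulation a system gets this tick.
function simulationMode(system, tick) {
  const active = system.playerCount > 0 || system.activeCombat || system.activeProduction;
  if (active) return 'full';          // full-fidelity simulation every tick
  // Dormant systems still advance, but only on a slower schedule.
  return tick % 60 === 0 ? 'dormant-update' : 'skip';
}
```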
The star map itself is a graph data structure. Each system is a node, and jump gates are edges. Pathfinding uses a modified A* algorithm that accounts for jump gate distances, sovereignty ownership (friendly vs. hostile space), and player-defined route preferences like "avoid low-security systems."
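A compact sketch of route-finding over the jump-gate graph is below. The hostile-space penalty weight is an illustrative guess, and since no per-system coordinates are available here the heuristic is zero, which makes this effectively Dijkstra (a valid, if unguided, special case of A*).

```javascript
// graph: { systemId: [{ to, distance, hostile?, lowSec? }, ...] }
function findRoute(graph, start, goal, prefs = {}) {
  const dist = { [start]: 0 };
  const prev = {};
  const open = new Set([start]);
  while (open.size) {
    // Pop the open node with the smallest cost so far.
    let cur = null;
    for (const n of open) if (cur === null || dist[n] < dist[cur]) cur = n;
    open.delete(cur);
    if (cur === goal) break;
    for (const edge of graph[cur] || []) {
      let cost = edge.distance;
      if (prefs.avoidHostile && edge.hostile) cost *= 10; // penalty weight is a guess
      if (prefs.avoidLowSec && edge.lowSec) cost *= 10;
      const alt = dist[cur] + cost;
      if (!(edge.to in dist) || alt < dist[edge.to]) {
        dist[edge.to] = alt;
        prev[edge.to] = cur;
        open.add(edge.to);
      }
    }
  }
  // Walk predecessors back from the goal to reconstruct the path.
  const path = [];
  for (let n = goal; n !== undefined; n = prev[n]) path.unshift(n);
  return path[0] === start ? path : null;
}
```

Player preferences act as cost multipliers rather than hard exclusions, so a "avoid hostile space" route will still cross hostile systems when no reasonable alternative exists.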
Challenges We Solved (and Some We Are Still Solving)
Browser memory limits. Unlike a desktop client that can claim gigabytes of RAM, browser tabs are more constrained. We aggressively manage memory by unloading assets for systems the player is not viewing and using object pooling for frequently created and destroyed entities like projectiles and UI elements.
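The object-pooling pattern mentioned above is worth a minimal sketch: recycle short-lived objects like projectiles instead of allocating fresh ones every frame, which keeps garbage-collection pauses out of the render loop. This is a generic sketch, not our actual pool class.

```javascript
// Minimal object pool: factory creates, reset scrubs state before reuse.
class Pool {
  constructor(factory, reset) {
    this.factory = factory;
    this.reset = reset;
    this.free = [];
  }
  acquire() {
    return this.free.pop() || this.factory(); // reuse if available
  }
  release(obj) {
    this.reset(obj); // never hand back stale state
    this.free.push(obj);
  }
}
```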
Reconnection handling. Browser connections drop. Tabs get backgrounded. Laptops go to sleep. Our reconnection system snapshots the player's last known state and, on reconnect, sends a compressed state sync that gets the client back to current within one tick cycle. The player's ships and structures continue operating on the server regardless of client connection status.
Cross-device consistency. A player might start a session on their desktop, check on their fleet from a phone during lunch, and issue trade orders from a tablet in the evening. The server-authoritative architecture makes this seamless. There is no client-side state that matters. Everything the client needs comes from the server on connection.
Anti-cheat in an open environment. The browser is inherently an open platform. Players can inspect network traffic, read client-side code, and modify local state. Our defense is simple: the client has no authority. Every action is a request to the server, and the server validates everything. You cannot tell the server your ship teleported across the map because the server tracks where your ship is and how fast it moves.
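Server-side validation of a movement command can be sketched like this. The field names and the single-tick speed model are assumptions for illustration; the point is that the check runs against the server's own record, never the client's claim.

```javascript
// Reject any requested destination the ship could not reach in one tick.
function validateMove(serverShip, requested, tickSeconds = 1) {
  const dx = requested.x - serverShip.x;
  const dy = requested.y - serverShip.y;
  const distance = Math.hypot(dx, dy);
  return distance <= serverShip.maxSpeed * tickSeconds;
}
```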
What We Would Do Differently
If we started over today, we would invest in WebAssembly earlier. Some of our heavier computations (pathfinding across the full star map, combat resolution with dozens of ships) would benefit from WASM performance on the client side for prediction and UI responsiveness.
We would also adopt a more aggressive entity-component-system architecture from day one. Our current object-oriented ship model works, but an ECS approach would give us better cache coherence and make it easier to add new systems without touching existing code.
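For readers unfamiliar with the pattern, a toy ECS looks like this: entities are just ids, components are plain data keyed by id, and systems iterate only over the entities that carry the components they need. This is a generic illustration of the pattern, not a design we have committed to.

```javascript
// Component stores: flat maps from entity id to plain data.
const components = { position: new Map(), velocity: new Map() };

function createEntity(id, pos, vel) {
  components.position.set(id, pos);
  if (vel) components.velocity.set(id, vel); // stations have no velocity
}

// A movement "system": touches only entities with both components.
function movementSystem() {
  for (const [id, vel] of components.velocity) {
    const pos = components.position.get(id);
    pos.x += vel.x;
    pos.y += vel.y;
  }
}
```

Adding a new behavior means adding a new component store and a new system, with no edits to existing entity classes, which is the extensibility win the paragraph above describes.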
The Result
Outer Directive proves that "browser game" does not have to mean "simple game." With the right architecture, you can deliver deep, persistent, multiplayer 4X gameplay in a browser tab. The tech stack of Three.js, Node.js, PostgreSQL, Redis, and WebSocket gives us the rendering fidelity, the server performance, the data persistence, and the real-time communication that a game of this scope demands.
We are continuing to push what is possible. If you want to see the result of all these technical decisions in action, explore the features or join our Discord to follow development.