Frontend System Design
Classic Frontend System Design Problems
These three problems appear repeatedly in FAANG frontend interviews. Mastering them gives you patterns that transfer to any frontend design question.
Problem 1: Design an Autocomplete / Typeahead
This is the most common frontend system design question. It tests debouncing, caching, race conditions, and accessibility.
Architecture
+--------------------------------------------------+
| SearchBar |
| +--------------------------------------------+ |
| | <input> [laptop_______________] [X Clear] | |
| +--------------------------------------------+ |
| +--------------------------------------------+ |
| | SuggestionsList (role="listbox") | |
| | ┌────────────────────────────────────────┐ | |
| | │ > laptop stand (highlighted) │ | |
| | │ laptop sleeve │ | |
| | │ laptop charger usb-c │ | |
| | │ laptop backpack │ | |
| | └────────────────────────────────────────┘ | |
| +--------------------------------------------+ |
+--------------------------------------------------+
Key Design Decisions
1. Debouncing input (300ms)
Do not fire an API call on every keystroke. Wait until the user pauses typing:
import { useState, useEffect } from 'react';
import { useQuery } from '@tanstack/react-query';

function useDebounce<T>(value: T, delay: number): T {
  const [debouncedValue, setDebouncedValue] = useState(value);
  useEffect(() => {
    const timer = setTimeout(() => setDebouncedValue(value), delay);
    return () => clearTimeout(timer);
  }, [value, delay]);
  return debouncedValue;
}

function Autocomplete() {
  const [query, setQuery] = useState('');
  const debouncedQuery = useDebounce(query, 300);
  // Only fetch once the debounced query settles and has at least 2 characters
  const { data: suggestions } = useQuery({
    queryKey: ['search', debouncedQuery],
    queryFn: () => fetchSuggestions(debouncedQuery), // suggest-API call, defined elsewhere
    enabled: debouncedQuery.length >= 2,
  });
  // ... render the input and suggestions
}
2. Caching results
Cache previous queries so navigating back is instant. TanStack Query handles this automatically with its query cache. For manual implementation:
const cache = new Map<string, string[]>();

async function fetchWithCache(query: string): Promise<string[]> {
  if (cache.has(query)) return cache.get(query)!;
  const results = await fetch(`/api/suggest?q=${encodeURIComponent(query)}`).then(r => r.json());
  cache.set(query, results);
  // Evict the oldest entry (Map preserves insertion order) once the cache exceeds 100 items
  if (cache.size > 100) {
    const firstKey = cache.keys().next().value;
    if (firstKey !== undefined) cache.delete(firstKey);
  }
  return results;
}
3. Handling race conditions
If the user types "lap", then quickly "laptop", the "lap" response might arrive after "laptop". Use an AbortController to cancel stale requests:
function useSearchSuggestions(query: string) {
  const [results, setResults] = useState<string[]>([]);
  useEffect(() => {
    if (!query) { setResults([]); return; }
    const controller = new AbortController();
    fetch(`/api/suggest?q=${encodeURIComponent(query)}`, { signal: controller.signal })
      .then(res => res.json())
      .then(data => setResults(data))
      .catch(err => {
        // Aborting rejects with an AbortError; ignore it, surface anything else
        if (err.name !== 'AbortError') console.error(err);
      });
    // Cancel the in-flight request if query changes before it completes
    return () => controller.abort();
  }, [query]);
  return results;
}
4. Keyboard navigation
Users must navigate suggestions without a mouse:
interface SuggestionsListProps {
  suggestions: string[];
  onSelect: (item: string) => void;
}

function SuggestionsList({ suggestions, onSelect }: SuggestionsListProps) {
  const [activeIndex, setActiveIndex] = useState(-1);
  // In the full combobox pattern this handler is attached to the <input>,
  // so keyboard focus never leaves the text field while navigating options.
  function handleKeyDown(e: React.KeyboardEvent) {
    switch (e.key) {
      case 'ArrowDown':
        e.preventDefault();
        setActiveIndex(i => Math.min(i + 1, suggestions.length - 1));
        break;
      case 'ArrowUp':
        e.preventDefault();
        setActiveIndex(i => Math.max(i - 1, 0));
        break;
      case 'Enter':
        if (activeIndex >= 0) onSelect(suggestions[activeIndex]);
        break;
      case 'Escape':
        setActiveIndex(-1);
        break;
    }
  }
  return (
    // tabIndex makes the list focusable so it can receive key events at all
    <ul role="listbox" tabIndex={0} onKeyDown={handleKeyDown}>
      {suggestions.map((item, i) => (
        <li
          key={item}
          role="option"
          aria-selected={i === activeIndex}
          className={i === activeIndex ? 'highlighted' : ''}
        >
          {item}
        </li>
      ))}
    </ul>
  );
}
5. Accessibility (ARIA combobox)
The input and suggestion list must form the ARIA combobox pattern. Per ARIA 1.2, role="combobox" belongs on the input itself, not on a wrapper element:
<div>
  <input
    role="combobox"
    aria-expanded={showSuggestions}
    aria-haspopup="listbox"
    aria-autocomplete="list"
    aria-controls="suggestions-list"
    aria-activedescendant={activeIndex >= 0 ? `suggestion-${activeIndex}` : undefined}
  />
  <ul id="suggestions-list" role="listbox">
    {suggestions.map((item, i) => (
      <li key={item} id={`suggestion-${i}`} role="option" aria-selected={i === activeIndex}>
        {item}
      </li>
    ))}
  </ul>
</div>
Problem 2: Design a Chat Application UI
This tests real-time communication, virtualization, and offline handling.
Architecture
+-----------------------------------------------------------+
| ChatApp |
| +--------------+ +-----------------------------------+ |
| | ConvoList | | ChatPanel | |
| | +---------+ | | +-------------------------------+ | |
| | | Alice | | | | MessageList (virtualized) | | |
| | | Bob * | | | | [Alice] Hey! 10:01 | | |
| | | Carol | | | | [You] Hi there 10:02 | | |
| | | | | | | [Alice] Check this.. 10:03 | | |
| | | | | | | ...thousands of messages... | | |
| | +---------+ | | +-------------------------------+ | |
| | | | +-------------------------------+ | |
| | | | | ComposeBar | | |
| | | | | [Type a message... ] [Send] | | |
| | | | +-------------------------------+ | |
| +--------------+ +-----------------------------------+ |
+-----------------------------------------------------------+
Key Design Decisions
1. Message list virtualization
A chat can hold tens of thousands of messages. Rendering them all at once bloats the DOM and can freeze the tab, so render only the rows near the viewport:
import { useRef } from 'react';
import { useVirtualizer } from '@tanstack/react-virtual';

function MessageList({ messages }: { messages: Message[] }) {
  const parentRef = useRef<HTMLDivElement>(null);
  const virtualizer = useVirtualizer({
    count: messages.length,
    getScrollElement: () => parentRef.current,
    estimateSize: () => 60, // estimated row height in pixels
    overscan: 10, // render 10 extra items above/below the viewport
  });
  return (
    <div ref={parentRef} style={{ height: '100%', overflow: 'auto' }}>
      {/* The inner container must be position: relative so rows can be absolutely placed */}
      <div style={{ height: virtualizer.getTotalSize(), position: 'relative' }}>
        {virtualizer.getVirtualItems().map(virtualRow => (
          <div
            key={virtualRow.key}
            style={{
              position: 'absolute',
              top: 0,
              width: '100%',
              height: virtualRow.size,
              transform: `translateY(${virtualRow.start}px)`,
            }}
          >
            <MessageBubble message={messages[virtualRow.index]} />
          </div>
        ))}
      </div>
    </div>
  );
}
2. Optimistic updates when sending
Show the message immediately before the server confirms:
function useSendMessage() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: (newMessage: NewMessage) => api.sendMessage(newMessage),
    onMutate: async (newMessage) => {
      // Cancel outgoing refetches so they don't overwrite the optimistic update
      await queryClient.cancelQueries({ queryKey: ['messages', newMessage.chatId] });
      // Snapshot the previous value for rollback
      const previous = queryClient.getQueryData(['messages', newMessage.chatId]);
      // Optimistically add the message with a temporary ID
      queryClient.setQueryData(['messages', newMessage.chatId], (old: Message[] = []) => [
        ...old,
        { ...newMessage, id: `temp-${Date.now()}`, status: 'sending' },
      ]);
      return { previous };
    },
    onError: (err, newMessage, context) => {
      // Roll back to the snapshot on failure
      queryClient.setQueryData(['messages', newMessage.chatId], context?.previous);
    },
    onSettled: (data, err, variables) => {
      // Refetch to ensure client state matches the server
      queryClient.invalidateQueries({ queryKey: ['messages', variables.chatId] });
    },
  });
}
3. WebSocket connection management
Maintain a persistent connection with automatic reconnection:
class ChatSocket {
  private ws: WebSocket | null = null;
  private reconnectAttempts = 0;
  private maxReconnectAttempts = 5;
  private listeners = new Map<string, Set<Function>>();

  connect(url: string) {
    this.ws = new WebSocket(url);
    this.ws.onopen = () => {
      this.reconnectAttempts = 0;
    };
    this.ws.onmessage = (event) => {
      const { type, payload } = JSON.parse(event.data);
      this.listeners.get(type)?.forEach(cb => cb(payload));
    };
    this.ws.onclose = () => {
      // Reconnect with exponential backoff, capped at 30 seconds
      if (this.reconnectAttempts < this.maxReconnectAttempts) {
        const delay = Math.min(1000 * 2 ** this.reconnectAttempts, 30000);
        setTimeout(() => {
          this.reconnectAttempts++;
          this.connect(url);
        }, delay);
      }
    };
  }

  on(event: string, callback: Function) {
    if (!this.listeners.has(event)) this.listeners.set(event, new Set());
    this.listeners.get(event)!.add(callback);
  }

  send(type: string, payload: unknown) {
    if (this.ws?.readyState === WebSocket.OPEN) {
      this.ws.send(JSON.stringify({ type, payload }));
    }
  }
}
4. Offline message queueing
Queue messages when offline and send them when the connection restores:
class OfflineQueue {
  private queue: NewMessage[] = [];

  enqueue(message: NewMessage) {
    this.queue.push(message);
    localStorage.setItem('offlineQueue', JSON.stringify(this.queue));
  }

  async flush(sendFn: (msg: NewMessage) => Promise<void>) {
    const pending = [...this.queue];
    this.queue = [];
    localStorage.removeItem('offlineQueue');
    for (let i = 0; i < pending.length; i++) {
      try {
        await sendFn(pending[i]);
      } catch {
        // Connection dropped again: requeue this message and all remaining ones
        for (const msg of pending.slice(i)) this.enqueue(msg);
        return;
      }
    }
  }

  restore() {
    const stored = localStorage.getItem('offlineQueue');
    if (stored) this.queue = JSON.parse(stored);
  }
}
5. Read receipts
Track which messages have been seen using an IntersectionObserver:
function useReadReceipts(chatId: string) {
  const observerRef = useRef<IntersectionObserver>();
  useEffect(() => {
    observerRef.current = new IntersectionObserver(
      (entries) => {
        const visibleIds = entries
          .filter(e => e.isIntersecting)
          .map(e => e.target.getAttribute('data-message-id'))
          .filter((id): id is string => id !== null);
        if (visibleIds.length > 0) {
          api.markAsRead(chatId, visibleIds);
        }
      },
      { threshold: 0.5 } // at least half the bubble must be visible to count as read
    );
    return () => observerRef.current?.disconnect();
  }, [chatId]);
  // Each message element must register itself, e.g.:
  // <div ref={el => el && observerRef.current?.observe(el)} data-message-id={msg.id}>
  return observerRef;
}
Problem 3: Design a Collaborative Document Editor
This tests your knowledge of real-time collaboration at a high level. Interviewers do not expect you to implement OT or CRDT from scratch.
Architecture
+------------------------------------------------------------+
| EditorApp |
| +--------------------------------------------------------+ |
| | Toolbar [B] [I] [U] | H1 H2 H3 | [Link] [Image] | |
| +--------------------------------------------------------+ |
| | +----------------------------------------------------+ | |
| | | DocumentCanvas | | |
| | | | | |
| | | The quick brown fox| <-- your cursor (blue) | | |
| | | jumps over the la|zy <-- Alice's cursor (green) | | |
| | | dog. | | |
| | | | | |
| | +----------------------------------------------------+ | |
| | +----------------------------------------------------+ | |
| | | PresenceBar [You (editing)] [Alice (editing)] | | |
| | +----------------------------------------------------+ | |
| +--------------------------------------------------------+ |
+------------------------------------------------------------+
Key Concepts
1. Operational Transform (OT) vs. CRDT
These are the two main approaches to resolving concurrent edits:
| Aspect | OT | CRDT |
|---|---|---|
| How it works | Transforms operations against each other on a central server | Each character has a unique ID; merges are automatic |
| Server | Required (central authority) | Optional (peer-to-peer possible) |
| Used by | Google Docs | Figma, Notion (Yjs library) |
| Complexity | Simpler concept, complex transforms | Complex data structure, simpler merging |
| Offline | Limited (needs server to resolve) | Excellent (merges on reconnect) |
What to say in an interview: "I would use a CRDT library like Yjs or Automerge. These handle conflict resolution automatically and support offline editing. The document state is represented as a CRDT data structure that can be merged deterministically regardless of the order operations arrive."
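To make the merge property concrete, here is a toy last-writer-wins (LWW) register in TypeScript. This is illustrative only: real editors use sequence CRDTs (e.g. Yjs's Y.Text), but the key property is the same, merging is deterministic regardless of the order in which operations arrive.

```typescript
// Toy CRDT: a last-writer-wins register. Each replica holds a value plus
// a logical timestamp and a node id used as a deterministic tie-breaker.
interface LWWRegister<T> {
  value: T;
  timestamp: number; // logical clock
  nodeId: string;    // tie-breaker so concurrent merges agree
}

function merge<T>(a: LWWRegister<T>, b: LWWRegister<T>): LWWRegister<T> {
  if (a.timestamp !== b.timestamp) return a.timestamp > b.timestamp ? a : b;
  return a.nodeId > b.nodeId ? a : b; // equal clocks: higher node id wins
}

// Two replicas edit concurrently; both merge orders yield the same state
const alice: LWWRegister<string> = { value: 'lazy dog', timestamp: 2, nodeId: 'alice' };
const bob: LWWRegister<string> = { value: 'lazy cat', timestamp: 1, nodeId: 'bob' };
console.log(merge(alice, bob).value); // 'lazy dog'
console.log(merge(bob, alice).value); // 'lazy dog' -- merge is commutative
```

A register can only hold one value, which is why text editors need sequence CRDTs, where every character carries an ID and position metadata; the merge rule, however, stays just as deterministic.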
2. Cursor synchronization
Each user's cursor position must be broadcast to all participants:
interface CursorPosition {
userId: string;
userName: string;
color: string;
index: number; // position in document
selection?: {
anchor: number;
head: number;
};
}
// Broadcast cursor position on change (throttled to 50ms)
function useCursorBroadcast(socket: ChatSocket, userId: string) {
const throttledSend = useMemo(
() => throttle((position: CursorPosition) => {
socket.send('cursor:update', position);
}, 50),
[socket]
);
return throttledSend;
}
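The hook above assumes a throttle helper (lodash provides a full-featured one). A minimal leading-edge version, for illustration:

```typescript
// Minimal leading-edge throttle: invoke fn at most once per `wait` ms.
// (lodash's throttle also fires on the trailing edge; this sketch does not.)
function throttle<A extends unknown[]>(fn: (...args: A) => void, wait: number) {
  let last = 0;
  return (...args: A) => {
    const now = Date.now();
    if (now - last >= wait) {
      last = now;
      fn(...args); // enough time has passed: invoke immediately
    }
    // otherwise drop the call; the next one after `wait` ms will fire
  };
}
```

For cursor broadcasts, dropping intermediate positions is acceptable because only the latest position matters to other participants.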
3. Conflict resolution UI
When conflicts cannot be auto-resolved, show users what happened:
- Highlight text that was edited concurrently in different colors
- Show a toast: "Alice edited this paragraph while you were offline. Your changes were merged automatically."
- Provide undo for the last merge if the result looks wrong
- Keep a version history so users can revert
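The version-history point can be sketched as a simple snapshot store. This is a hypothetical sketch: a production editor would persist compact diffs or CRDT updates rather than full document copies.

```typescript
// Illustrative version history: store full snapshots, revert by index.
interface Version {
  content: string;
  author: string;
  savedAt: number;
}

class VersionHistory {
  private versions: Version[] = [];

  snapshot(content: string, author: string) {
    this.versions.push({ content, author, savedAt: Date.now() });
  }

  list(): readonly Version[] {
    return this.versions;
  }

  // Revert by appending the old content as a *new* version, so the
  // revert itself remains visible in history and is undoable.
  revertTo(index: number, author: string): string {
    const target = this.versions[index];
    if (!target) throw new Error('No such version');
    this.snapshot(target.content, author);
    return target.content;
  }
}
```

Appending on revert (rather than truncating history) is the safer design: users can always recover the state they reverted away from.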
Interview tip: For collaborative editor questions, focus on architecture decisions and trade-offs rather than algorithm internals. Show awareness of libraries (Yjs, Automerge, Liveblocks) and explain how you would integrate them.
Next, we will cover component library and design system architecture.