Web Performance & Accessibility

Advanced Performance Patterns

4 min read

Beyond Core Web Vitals, interviewers expect you to know practical performance optimization techniques. This lesson covers the patterns you will be asked to implement or discuss.

Code Splitting Strategies

Code splitting breaks your JavaScript bundle into smaller chunks loaded on demand. Without it, users download the entire application before seeing anything.

Route-Based Splitting

The most common approach — each route loads its own bundle:

import { lazy, Suspense } from 'react';
import { Routes, Route } from 'react-router-dom';

// Each route is a separate chunk
const Home = lazy(() => import('./pages/Home'));
const Dashboard = lazy(() => import('./pages/Dashboard'));
const Settings = lazy(() => import('./pages/Settings'));

function App() {
  return (
    <Suspense fallback={<LoadingSpinner />}>
      <Routes>
        <Route path="/" element={<Home />} />
        <Route path="/dashboard" element={<Dashboard />} />
        <Route path="/settings" element={<Settings />} />
      </Routes>
    </Suspense>
  );
}

Component-Based Splitting

Split heavy components that are not immediately visible:

import { lazy, Suspense, useState } from 'react';

// Heavy chart library only loads when user opens analytics
const AnalyticsChart = lazy(() => import('./components/AnalyticsChart'));

function Dashboard() {
  const [showChart, setShowChart] = useState(false);

  return (
    <div>
      <h1>Dashboard</h1>
      <button onClick={() => setShowChart(true)}>Show Analytics</button>

      {showChart && (
        <Suspense fallback={<div>Loading chart...</div>}>
          <AnalyticsChart />
        </Suspense>
      )}
    </div>
  );
}

Dynamic import() for Non-React Code

// Load a heavy library only when needed
async function handleExport() {
  const { exportToPDF } = await import('./utils/pdfExport');
  exportToPDF(document.getElementById('report'));
}

// Conditional feature loading
async function initEditor() {
  if (userHasPermission('edit')) {
    const { RichTextEditor } = await import('./components/RichTextEditor');
    mountEditor(RichTextEditor);
  }
}
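Native import() caches the resolved module by specifier, but any one-time setup you do around it (instantiating an editor, registering plugins) is not cached. A small sketch of a share-the-promise wrapper; the `once` helper and the editor setup are illustrative, not a real API:

```javascript
// Run an async loader at most once and share the resulting promise.
// Useful when the load step includes one-time setup beyond import().
function once(loader) {
  let promise = null;
  return (...args) => (promise ??= loader(...args));
}

// Hypothetical usage: load the editor chunk and initialize it once.
const getEditor = once(async () => {
  const { RichTextEditor } = await import('./components/RichTextEditor');
  return new RichTextEditor({ theme: 'light' }); // one-time setup
});
```

Concurrent callers all await the same promise, so the setup never runs twice even if two clicks race.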

Image Optimization

Native Lazy Loading

<!-- Browser handles lazy loading natively -->
<img src="/photo.jpg" alt="Description" loading="lazy"
     width="400" height="300">

<!-- Do NOT lazy load above-the-fold images -->
<img src="/hero.jpg" alt="Hero" loading="eager"
     fetchpriority="high" width="1200" height="600">

Responsive Images with srcset

<!-- Browser picks the right size based on viewport -->
<img srcset="/photo-400.jpg 400w,
            /photo-800.jpg 800w,
            /photo-1200.jpg 1200w"
     sizes="(max-width: 600px) 400px,
            (max-width: 1000px) 800px,
            1200px"
     src="/photo-800.jpg"
     alt="Responsive photo">

The <picture> Element for Format Selection

<picture>
  <!-- Browser picks the first supported format -->
  <source srcset="/photo.avif" type="image/avif">
  <source srcset="/photo.webp" type="image/webp">
  <img src="/photo.jpg" alt="Photo with format fallback">
</picture>

Next.js next/image

import Image from 'next/image';

// Automatic optimization: resizing, format conversion, lazy loading
<Image
  src="/hero.jpg"
  alt="Hero image"
  width={1200}
  height={600}
  priority       // Preloads and disables lazy loading; use for the LCP image
  placeholder="blur"
  blurDataURL={blurHash}
/>

Resource Hints: Prefetch, Preload, Preconnect

<!-- PRELOAD: High-priority resource needed for current page -->
<!-- Use for fonts, critical images, above-fold CSS -->
<link rel="preload" href="/fonts/main.woff2" as="font"
      type="font/woff2" crossorigin>

<!-- PREFETCH: Low-priority resource needed for future navigation -->
<!-- Use for next-page bundles the user is likely to visit -->
<link rel="prefetch" href="/js/dashboard.chunk.js">

<!-- PRECONNECT: Establish early connection to a third-party origin -->
<!-- Saves DNS lookup + TCP + TLS handshake time -->
<link rel="preconnect" href="https://api.example.com">

<!-- DNS-PREFETCH: Only resolve DNS (lighter than preconnect) -->
<link rel="dns-prefetch" href="https://analytics.example.com">

When to use each:

Hint          Priority  Use Case
preload       High      Current page critical resources
prefetch      Low       Next page resources
preconnect    Medium    Third-party APIs you will call soon
dns-prefetch  Low       Third-party origins for analytics, ads
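Hints can also be injected at runtime, for example prefetching the next page's bundle when the user hovers its nav link. A minimal sketch; `addResourceHint` and the `data-prefetch` attribute are illustrative, and `doc` is a parameter so the helper can be exercised outside a browser:

```javascript
// Inject a resource hint <link> into the document head at runtime.
function addResourceHint(doc, rel, href, extra = {}) {
  const link = doc.createElement('link');
  link.rel = rel;
  link.href = href;
  Object.assign(link, extra); // e.g. { as: 'font', crossOrigin: '' }
  doc.head.appendChild(link);
  return link;
}

// Browser usage sketch: warm likely next pages on hover.
// document.querySelectorAll('a[data-prefetch]').forEach((a) => {
//   a.addEventListener('mouseenter',
//     () => addResourceHint(document, 'prefetch', a.href), { once: true });
// });
```

Hover-to-prefetch buys a few hundred milliseconds of head start at low cost, since most hovers precede a click.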

Service Workers and Caching Strategies

Service Workers intercept network requests and can serve cached responses:

Cache-First (Offline-First)

Best for static assets that rarely change:

// service-worker.js
self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.match(event.request).then((cached) => {
      // Return cached version, fall back to network
      return cached || fetch(event.request).then((response) => {
        // Only cache successful responses (avoid pinning a 404 or 500)
        if (response.ok) {
          const clone = response.clone();
          caches.open('static-v1').then((cache) => {
            cache.put(event.request, clone);
          });
        }
        return response;
      });
    })
  );
});

Network-First

Best for dynamic data that must be fresh:

self.addEventListener('fetch', (event) => {
  event.respondWith(
    fetch(event.request)
      .then((response) => {
        // Update cache with fresh response (skip error responses)
        if (response.ok) {
          const clone = response.clone();
          caches.open('dynamic-v1').then((cache) => {
            cache.put(event.request, clone);
          });
        }
        return response;
      })
      .catch(() => {
        // Network failed, serve from cache
        return caches.match(event.request);
      })
  );
});

Stale-While-Revalidate

Best balance of speed and freshness — serve cached, update in background:

self.addEventListener('fetch', (event) => {
  event.respondWith(
    caches.open('swr-v1').then((cache) => {
      return cache.match(event.request).then((cached) => {
        const fetchPromise = fetch(event.request).then((response) => {
          cache.put(event.request, response.clone());
          return response;
        }).catch(() => cached); // offline: fall back to the cached copy
        // Return cached immediately, update cache in background
        return cached || fetchPromise;
      });
    })
  );
});
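The same control flow, stripped of Service Worker APIs, can be sketched over a plain Map. This is handy for whiteboard explanations; `createSWRCache` is an illustrative helper, not a library function:

```javascript
// Stale-while-revalidate over an in-memory Map:
// serve the cached value instantly, refresh it in the background.
function createSWRCache(fetcher) {
  const cache = new Map();
  return async function get(key) {
    const revalidate = fetcher(key)
      .then((fresh) => { cache.set(key, fresh); return fresh; })
      .catch(() => cache.get(key)); // network failed: keep the stale value
    if (cache.has(key)) return cache.get(key); // stale, returned immediately
    return revalidate; // first request: nothing cached yet, wait for network
  };
}
```

The key property to call out: every read after the first is as fast as a cache hit, and the cache is never more than one request behind the network.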

Bundle Analysis

Identifying Large Dependencies

# Webpack bundle analyzer: emit build stats, then open the treemap
npx webpack --profile --json=stats.json
npx webpack-bundle-analyzer stats.json

# Next.js analyzer
# Install: npm install @next/bundle-analyzer
# next.config.js:
# const withBundleAnalyzer = require('@next/bundle-analyzer')({
#   enabled: process.env.ANALYZE === 'true',
# });
# module.exports = withBundleAnalyzer(nextConfig);
# Run with: ANALYZE=true next build

Common oversized dependencies and alternatives:

Heavy Library  Size    Lightweight Alternative    Size
moment.js      ~300KB  date-fns (tree-shakeable)  ~10KB used
lodash         ~70KB   lodash-es + tree-shaking   ~5KB used
Full chart.js  ~200KB  Lazy load on interaction   0KB initial

When Memoization Helps (and When It Hurts)

This is a favorite interview topic. Many candidates overuse React.memo, useMemo, and useCallback.

The Cost of Memoization

Every memoization has overhead:

  1. Memory: storing the previous value and dependencies
  2. Comparison: React must compare dependencies on every render
  3. Complexity: more code to read, maintain, and debug
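The comparison cost in point 2 can be sketched without React. This is not React's actual implementation, just the shape of it: remember the last dependency array and recompute only when `Object.is` finds a changed entry:

```javascript
// Simplified model of useMemo's dependency check (not React's real code).
function createMemo() {
  let lastDeps = null;
  let lastValue;
  return function memo(compute, deps) {
    const unchanged = lastDeps !== null &&
      deps.length === lastDeps.length &&
      deps.every((dep, i) => Object.is(dep, lastDeps[i]));
    if (!unchanged) {
      lastValue = compute(); // only re-run when a dependency changed
      lastDeps = deps;
    }
    return lastValue;
  };
}
```

Note the `deps.every(...)` walk runs on every call. That is the overhead you pay even when nothing changed, which is why memoizing a trivial computation is a net loss.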

When to Use useMemo

// GOOD: Expensive calculation that would slow down every render
function ProductList({ products, filter }) {
  const filtered = useMemo(() => {
    // Filtering 10,000 products is expensive
    return products.filter(p =>
      p.name.toLowerCase().includes(filter.toLowerCase()) &&
      p.price >= filter.minPrice &&
      p.category === filter.category
    );
  }, [products, filter]);

  return <ul>{filtered.map(p => <ProductCard key={p.id} product={p} />)}</ul>;
}

// BAD: Trivial calculation — the comparison costs more than the computation
function UserGreeting({ name }) {
  // Do NOT memoize this — string concatenation is near-instant
  const greeting = useMemo(() => `Hello, ${name}!`, [name]);
  return <h1>{greeting}</h1>;
}

When to Use useCallback

// GOOD: Handler passed to a memoized child that checks referential equality
const MemoizedList = React.memo(({ items, onItemClick }) => (
  <ul>{items.map(item => (
    <li key={item.id} onClick={() => onItemClick(item.id)}>{item.name}</li>
  ))}</ul>
));

function Parent({ items }) {
  // Without useCallback, MemoizedList re-renders every time Parent renders
  // because onItemClick would be a new function reference each time
  const handleClick = useCallback((id) => {
    console.log('Clicked:', id);
  }, []);

  return <MemoizedList items={items} onItemClick={handleClick} />;
}

// BAD: No memoized child — useCallback adds overhead with zero benefit
function SearchForm() {
  const [query, setQuery] = useState('');

  // This useCallback is pointless — the input is not memoized
  const handleChange = useCallback((e) => {
    setQuery(e.target.value);
  }, []);

  return <input value={query} onChange={handleChange} />;
}

The Decision Rule

Only memoize when all three conditions are true:

  1. You have measured a performance problem (React DevTools Profiler)
  2. The computation is genuinely expensive OR referential equality matters for a memoized child
  3. The dependencies change less often than the component renders

Interview tip: If asked "When would you use useMemo?", start with "I would first measure with React DevTools Profiler to confirm there is a problem" — this shows maturity.

Next, we will cover WCAG 2.2 accessibility standards and testing techniques.
